Wednesday, November 17, 2010

Euro Trip : Paris in a day

Catacombs tour, walk up Montmartre, then scoot over to the Tower and walk to the Louvre via the Arc. That's pretty much Paris in a day, though there is still Versailles, and that's not technically in Paris.

We had lovely, unforecast sun. I may have even got a little burned.

Tuesday, November 16, 2010

Euro Trip : To Paris, and everything since the last one

Thus begins our journey to Paris. Our last early morning rise to catch a 6:30am train from Berlin to Karlsruhe, sit around for an hour or so at noon, then trundle on board another train for the remainder of the ten hour trip.

The last entry covered leaving Venice, and heading for Munich. The thermals came in very handy. On our first afternoon, we went for a wander to a camping store, to pick up some more socks for me, passing by the university and a few monuments. The biggest thing to get used to was organized traffic and cyclists that shared part of the footpath with pedestrians. They wore no helmets, which was a little off-putting, but then again, no one was traveling at breakneck speeds and they generally kept single file.

The next day was a bus ride to Neuschwanstein (think new-swan-stone, but replace the w with a v; that is also the literal translation) and Linderhof. This was my second trip to the stein, but it was no less impressive. The tour was the standard Grayline coach tour, but the tour host was able to offer extra information about Munich, Bavaria, the castles and their designer, the "Mad" King Ludwig II.

I tend to think of Ludwig as the Sad King. Brought to power too young (at 18), his role was a disappointment to him, as he was not a true king (as was Louis XIV, whom he greatly admired) and had to share power with a parliament. Already painfully shy, he set about doing what he really wanted to do: build castles and keep to himself.

Linderhof was the only castle completed in his lifetime. It was a homage to Louis XIV, and only showed images of the Sun King. It was also quite small, meant only to house Ludwig and his servants. I could go on and on, but Google is your friend; check it out if you have the chance. It seems the rest of Germany views the castles as a place for Japanese tourists to visit.

Ludwig died a "mysterious" death alongside the doctor who had declared him mad before ever meeting him. My guess is that once the doctor did meet him, and perhaps concluded that he was not mad, they were both killed, but it was made to look like some arranged murder/suicide via drowning (apparently they both drowned, by the side of a lake). Modern forensics could probably deduce a less mysterious conclusion. It was a sad ending to a 700-year rule by one royal family over Bavaria.

Also, while in Munich, I did spy the hotel that I stayed in, some 7 or so years ago. Hotel Stachus was still there, now flanked by a KFC and a department store. The character of that part of Munich seemed to have changed, with a vast increase in tourists, beggars and people going about their business. Or maybe they just weren't piled into the tents of the Oktoberfest grounds this time.

The next port of call was Bremen. That was a very nice two-day stay, as we stayed with distant relatives of Annika. It turns out that in former times there were three families of Treuels located in the Bremen area on the banks of the river Elb(?), although no link has been found between those families. Also, the actual meaning of Treuel is a little hazy, too, but it is thought that it might be related to the job of pulling ships up the river. Given that Bremen was such a major port, there may be no relationship between the three families except for the job common to them. DNA tests could probably determine any blood relationship, but since the Australian Treuels keep sending fakes to this particular family, it probably doesn't matter. The German pronunciation of Treuel would give Australians even more issues than they already have. Treuel often gets transcribed as Trevel or Truel. If you roll your Rs and say troil, but with a really short O, you'll get close to how it is normally pronounced. It's almost one syllable.

The Treuel family are wonderful, generous people and showed us around Bremen and some of the surrounding areas. We had a wonderful dinner at a restaurant at the end of a dyke path, and were treated to a very fun interactive museum of natural history called the Universum Bremen. 

Our next stop was Berlin. With only one full day in Berlin, we aimed to make the most of it with walking tours from Alexanderplatz to the Brandenburg Tor, and side trips out to Sachsenhausen and Checkpoint Charlie. Wombats is a great hostel, but the clientele are quick to forget that not everyone is out and about at 2 and 5am, especially that group of noisy boys staying on level 2. So we had a bit of a late start, but a mostly fine day, exploring some of Germany's grim history. Small tip for fellow travelers: the S1, north, seems to have some track works between Frohnau and Birkenwerder, so you may wish to grab a cheap Sunday all-day pass and catch a PEG from Lichtenberg to Oranienburg instead. Or take the replacement bus transfer. It was a bit confusing for non-locals, like us.

After the grimness of Sachsenhausen, Checkpoint Charlie was not as we had expected. There is a checkpoint still there, with two soldiers posing for souvenir photos. However, the rest is a boarded panel wall telling the tales of the wall's construction and attempted escapes, blocking what could be construction of a proper memorial. There was no portion of the original wall to be seen, but it was dark and perhaps we were looking in the wrong spot.

One tour pamphlet we read said that Berlin was not pretty, being constructed with concrete slabs. Take away the German signs, and we could have been in parts of Brisbane, Sydney or Melbourne. Maybe it was the relative newness of the city compared with other parts of Europe that are still sporting their ruins from older times.

We strongly recommend the Vietnamese restaurant across the road from Wombats. It's top quality and very reasonably priced, with an entree and main coming to about €10 each, excluding drinks.

I'm glad Paris is the second last stop for us, and that we are there for a few days before moving on. All this train travel makes me feel like I'm moving, even when I'm not.

Tuesday, November 9, 2010

Euro Trip : Good day for ducks

Our first full day in Venice started wet. It's a bit of an interesting thing, over in the Dorsoduro district. We're actually staying near the border of that district, in the Castello area. When you pass over into Dorsoduro, the combination of rain, high tide and wind means that the place goes under water, especially around the Piazza San Marco. We had started doing a walking tour of the area, with the help of the Lonely Planet guide book for Italy, and had got just a few hundred meters past the piazza, when we decided to backtrack and pick up some gumboots we had seen, going for €15 each. Most vendors had them going for €20.

So back to the hotel we went, dropped our shoes, and started again, with the comfort of dry feet.

The walking tour was a showcase of churches and historically significant buildings, but since all we had was the route transposed on to a tourist map (with street names!), it was nice to just follow the walkways and see the sights without getting too lost. And even if you did get lost, there are always signs for San Marco, the Rialto, Piazzale Roma and the Ferrovia.

Time certainly does pass quickly on one of these walks. After our false start, it was already 10am. We ended up at the termini at about noon, to make use of the guaranteed, but pay-for-use, WC. It was about 1pm when we seriously considered getting something to eat, and 2:30pm before we actually did.

Venice really is quite pretty, and it's easy to spend the whole day wandering around, getting lost. For a cheap stay, I'd recommend off-peak, two nights, and be prepared to buy gumboots. You could probably spend another couple of days actually going into those old buildings, seeing glass being blown, and taking a gondola ride when the afternoon warms up.

Tomorrow is the transit day to Munich. Hoping Travelex pull their finger out with the transfer I made last Thursday, so I can get some clean socks when we get there.

Monday, November 8, 2010

Euro Trip : Via per Venezia

Today was our transit day from Cinque Terre to Venice. I think it's the most trains I've ever caught in one day. Riomaggiore to La Spezia, then to Pisa, then to Florence, then to Venice. The day started a bit cloudy, but we didn't get steady rain until getting to Venice.

The first bit of fun was finding our hotel. First, there was the ferry service to catch to S. Zaccaria, which is kinda not sensible walking distance from the termini, with packs. Next, we realized that the printed Google map and the little tourist map were very sparse on street names. And Venice has many, many streets, none of which are easy to remember, or get to. A GPS would be very handy. We finally made it to our hotel, just as the light had been replaced by darkness.

After dumping our bags, and checking emails we headed out for a quick wander to see if we could find the efficient route to the hotel that we should have taken in the first place, and to grab a bite to eat.

Being on the move all the time is starting to take its toll. I'm thinking about wearing this T-shirt again, for the second day, and I'm looking forward to Munich, in two days' time, where I can pick up another pair of walking socks (or two).

I guess we're at the hump of our holidays now. In two Sundays' time, we'll be catching our flights back to Australia. With all the excitement in the news regarding the A380s, it will be interesting to see if we actually depart on time.

Oh, we found Twisties. They're called Fonzies! Ayeeeeeee!

Saturday, November 6, 2010

Euro Trip : Thieving Bastards

On the morning of leaving Florence for Pisa, and then the Cinque Terre, I did a quick funds check to see how much we had left on our Travelex card. When I logged into my Internet banking account, I noticed $9000 in charges that we weren't expecting.

It would seem that from the day we flew out of Australia, persons unknown had decided to use our credit card details to book flights and accommodation throughout the UK and Europe. A short call to the credit card services and we got our cards cancelled. They did offer to send an emergency card to us, but because we're on the move, we declined.

The Travelex is enough for Euros; we can just make sure we have enough cash when we get back into London. All that remains is for me to contact the fraud department when we get back to Australia, and we can go through which transactions are legit, and which are bogus. We should also have replacement cards waiting for us when we get there, so we can re-establish standing orders, like mobiles, Internet, private health insurance, charities, etc. We'll do a few spot checks for bills online when we get to some place where we feel a bit safer letting our minds wander into the websphere, without having to be too mindful of our immediate surroundings. Even as I write this, traveling on the train from La Spezia to Pisa, I keep a watchful eye on people wandering the carriages.

Euro Trip : Cinque Terre becomes Duo Terre

With our little bit of excitement regarding our credit cards, we were a little late to follow through on our plans to visit the Leaning Tower of Pisa on the way to Cinque Terre. So we stopped in Pisa long enough to pick up some lunch and catch our connecting train to La Spezia, where we would catch another train to the first town in the Cinque Terre group, Riomaggiore. I'm still not sure how to pronounce that properly.

We had heard that you have to pick up passes for the Cinque Terre track in La Spezia, so Annika made some initial enquiries, while I waited on the platform with the bags.

I could see her talking to the info officer. She face palmed. That's not good. Then she rushed out and picked up a couple of tickets from next door, and we rushed to platform 7, to make the next train to Riomaggiore.

It turns out that only one of the tracks that connects each town to the next is actually open. All the rest are closed, due to bad weather. When we arrived, we could see a sign saying that the track between the second town and the third town was closed, due to fallen rocks. And then there was a separate sign, with everything from the third town to the last town being closed due to bad weather. That was a bit disappointing.

However, we got a beautiful room to stay in, in Riomaggiore. It overlooks the little harbor, though it's not quite visible in all the classic photos of that town.

We decided we would see if the track was open the next day, and if not, we would have our trip to Pisa and take our chances with the whole track possibly being open on our last full day in Riomaggiore.

Well, we're on the train back to La Spezia, having seen the Tower and even walked up it. Holy cow, what an experience. Between the lean that was setting my inner ear on full tilt, and the indents in the marble steps, caused by 800 years or so of traffic, it was a bit of a fun house, minus the fun. We made it to the top, but with much trepidation. I think there would even have been cause for concern if the lean was not there, but the lean didn't help. All the smiles in our photos were nervous ones.

I should mention the lovely restaurant in Riomaggiore. La Lampere(?) has lovely seafood, and the regular range of pizza. I had a very nice oven baked Sea Bass with potatoes and tomatoes. Annika had a yummy scampi pasta. The waiters seemed to cover three generations from the same family, and the music was... choice, with selections from something you might hear in a classical fine dining restaurant, to Lenny Kravitz and Green Day.

Tonight will be cheap eats, with take-away pizza and French fries. Tomorrow, we will check out the track, hope more of it opens, and travel between the towns by train if not. There's usually enough to explore in each town, with the non-park tracks, if Riomaggiore is anything to go by.

Euro Trip : Leaving Napoli

It's been a few days since we left the sanitary disaster that was Napoli. Something that I failed to mention in previous posts was that Napoli was a mess because the local government could not come up with a good waste management plan, and was prone to using whole towns as the dumping site. Somehow, the mafia is involved as well.

Anyway, next stop was Firenze, or Florence, as it is printed on English maps. Firenze is a big change from Napoli. Streets are clean, walls are relatively graffiti-free, and the place is packed with tourists, trying to navigate to the next sightseeing spot.

It's a pity we only had one afternoon and one night to explore. Then again, there's only so much looking at old buildings you can do. Ruins hold much more interest for me.

We made the most of the day with a walk to most of the major local sites. We missed out on the palace gardens, though. One hour is not enough for the ticket sellers to make it seem worth your while.

That night, we had a wonderful fancy meal at the Osteria Pepo, on Rosina. If you get enough meat eaters together, you may want to try the beef steak Florentine style. At a minimum purchase of 1kg, and sold by the 100gm chunk, you'll want at least two people who don't mind their meat still mooing at them. Failing that, the fillet with green pepper sauce (the non capsicum variety) will do the trick, or the thin slices with parmesan and rocket.

The late afternoon markets are pretty exciting. I tend to skip markets, but the goods seemed pretty good, with lots of leather and wool goods actually made in Italy. There's also a movement to prevent the sales of cheap knock offs, so whatever you buy will be genuine something.

The next day, tragedy.

Wednesday, November 3, 2010

Euro Trip : Napoli (redeeming features)

One of Napoli's redeeming features is its food, and specifically La Brace, which is just around the corner from the hotel. I did read that you would have to go out of your way to find bad food, or simply stop at the first McDonalds you see from the termini. Regardless, La Brace came through for us on our first Sunday night in Napoli, and after having eaten at a couple of places where the menu changes when you sit down, or not everything is available (no pizza at a pizzeria!), it has saved us again after a full day of wandering the Italian countryside.

So, the wandering...

Well, first we rewind one day. Yesterday was an organized tour of old Pompeii and Mt Vesuvius. Very nice, quite safe, except for those high winds at the top of the mountain. All in all, a no risk day.

However, today was the "choose your own adventure" day, in which there was a general plan to visit some Greek ruins and the Amalfi Coast. So, we took a train down to Salerno, courtesy of our Eurail pass. Then we jumped on a bus heading out to Paestum, the location of some Greek ruins. It was already a bit cloudy, but after we jumped on the bus, it started pissing down. Oh well, there's no bad weather, just inappropriate clothing (that's a mantra useful on most occasions, except hurricanes). The rain toned down a bit by the time we hit Paestum, but rain coats and brollies were handy.

After a wander through the museo and the ruins, and a quick bite to eat at a pizzeria with no pizza, we decided the rain was too much to make the bus ride to the Amalfi worth it. Besides, we had to make it back to Salerno first!

We learned two things, which ended up being one thing, before our journey back to Salerno was done.

The first was that train tickets are not always available from the train station. We didn't initially plan to return via train, but it became an option when trying to board one bus didn't quite go to plan. "Back" was the word, and we thought that meant "catch the bus from the other side of the road". Turns out, not so much, but we decided to try for the train station instead, where a sign welcomed us to buy tickets from a bar, a tobacconist or some other corner shop.

After one more bus going past was able to confirm that we were standing on the right side of the road all along, we were finally informed by the next bus driver that we could not buy bus tickets on the bus. He was good enough to stop at the next available and open bar for us to buy our tickets, and waited for us to run back. Our first bus out to Paestum must have been with a different operator that did take payment on the bus.

Bus tickets seem to be purchased for journey time, rather than the journey itself. Luckily, they're fairly cheap, with 140 minutes of travel costing only €3.

By the time we got back to Salerno, any attempt to make it to Amalfi would have us arriving after dark. A bit of a pity, since the rain had cleared up quite a lot, and the sun was in full shine while we sat outside the train station, waiting for the next bus to Salerno.

Tomorrow, Florence.

Apologies to Facebook commenters. I'm doing everything via iPhone at the moment (and burning through the souls of innocent kittens while doing so, apparently), and the Facebook app doesn't let me view comments on notes. Maybe I'll pull my finger out and look up the comments via Safari.

Updated for formatting - 25/11/2010

Monday, November 1, 2010

Euro Trip : Napoli (first impressions)

I have to place a lot of caveats on this impression, since we've only been here for two hours and fifteen minutes. It's Sunday afternoon, most shops are closed, except for a few eating places. The streets are lined with hawkers selling bags, shoes, shirts and belts. Where there is nothing being sold, there is rubbish. Large skips, overflowing with rubbish. Maybe it's because it is the end of the week. Maybe all this stuff gets cleaned up on Monday.

The piazza Garibaldi, outside of the main terminal, is a construction zone. After checking in, we went for a little two hour walk to the Duomo. The piazza Nicola Amore, along the way, also a construction zone.

We haven't been down to the water front yet. I'm not holding out much hope.

My first impressions of Napoli: a bit of a shit hole.

Euro Trip : Rome'd Out

Today is the second full day in Rome, and thankfully, the last. I'm Rome'd out. We both are.

Yesterday was the Colosseum and Palatine Hill via a very good tour (Dark Rome, via Viator), and today was the crypts via the same group. Those were the official tours.

The unofficial tours were the stretch of road between the Termini and the Colosseum, twice (morning tour was moved to the afternoon due to strike action); a glimpse of Vatican City (Saturday morning is not an ideal time to visit due to hordes of children); the Trevi fountain; and the queue to activate our Eurail passes and reserve seating for our journeys through Italy (keep a spare hour or two up your sleeve for that one).

This may all sound very negative, but you have to look hard for the silver lining. I guess Rome is a big city. It's guided chaos, and attracts tourists, pilgrims, and those that prey on those types. It's hard to find a nice quiet spot to eat a cafe-bought panini that doesn't smell of piss.

However, the people are friendly for the most part, and the Hotel Kennedy (where we are staying) is very nice, especially since it is in the heart of the tourist zone.

Tonight we had a sit down meal at one of the local trattoria. Sure, it was a little touristy, but the servings were generous and we didn't feel as though we were gouged when we got our €40 bill. The guided tours were really good. There's so much more to learn from a guided tour than you would get from wandering around yourself, or even one of those phone tours. Our crypt tour took us to two places we weren't even expecting to go.

Tomorrow, we're off to Napoli. I think it's going to be nicer than Rome. Mostly that's down to the accommodation being very close to the train station, and our Pompeii and Mt Vesuvius tour picking us up from the hotel.

Updated for formatting and Dark Rome link, and I was dead wrong about Napoli - 25/11/2010

Euro Side Notes

Euro Side Notes shall be a collection of short tales that have happened, but haven't made it into the main blog stories for that day.

At the moment, we're waiting at Terminal 5 (Heathrow) for our flight to Rome. We had to get up at 5am, to catch a mini cab at 5:30, to get us to the airport 2 hours before our 9:10 flight. Check-in procedures at Heathrow are really bloody efficient, such that we've been waiting for our gate announcement since 6:45. Oh well. Security checks are still on the stupid scale, with Annika having to remove her boots and jacket, and jump on a small box to be scanned.

Today, I left my faithful Macpac of 11 years at the Euro Lodge. On our first day of arriving, the plastic that holds the main straps to the light aluminum frame broke. I guess it happened as a result of exposure to the corrosive salt of Vanuatu. Anyway, that was enough to spur me to get a new one. It's a bit of a pity, because just about everything else about the pack was still good. One of the zips needed a minor repair.

So on our first day, trying to stay awake until we could check in at our accommodation, we went for a wander to Oxford Circus and the surrounding area. After visiting the British Museum to see the Egyptian exhibit, we went back to a Kathmandu we had spotted earlier.

From there I picked up a nice 75L pack. The day pack isn't as good as my previous one, but being able to hold a bit less, it ought to be lighter. Doh! I've actually kept the older day pack, but packed it in my main pack. That will be getting swapped over when we get to Rome. Just another 15min until the gate is announced.

The airport has a couple of apparently free wi-fi networks, but neither seems to work. So in the meantime, I'm writing my blog entries in the Notes app on my iPhone, and will email them on when I can get a connection.

Thursday, October 28, 2010

Euro Holiday : Day Two : London, again

A much more pleasant day for day two. A balmy 14 degrees and some rain to start off, and a continental breakfast that made you glad there's a five-pound, all-day breakfast joint just a few doors down.

The Tower of London was first on the list, thanks to the recommendations of friends at dinner last night. The sun made a surprise guest appearance, stuck around for an encore, then invited everyone to the all-afternoon after party. Awesome!

The afternoon was mostly spent lining up for the London Eye, but it could have been worse. We could have been stuck behind those noisy French kids who seemed to think everyone would enjoy their raucous shouting and impromptu dancing. And it could have been raining.

We lasted until 4 pm today before heading back for a little rest and prep, then out to Soho this evening for either a comedy show or a Chinese meal (hmm, aromatic duck, baby!).

Tomorrow, our mission is to make it to Rome. It will be an early start, but at least the flight will be shortish, and we'll have a mini cab booked to get us to Heathrow.

Wednesday, October 27, 2010

Euro Holiday: Day One

Finally showered after walking around a very wet London since we dumped our bags at the accommodation at 8:30 am. We had touched down at Terminal 3 at 5:30 am *yawn*.

The plastic bits in my backpack had perished since the last time I hoisted it on to my back. I think that might have been at Vanuatu at the start of this year. For an eleven year old pack, it's had a good life but it won't make it back to Australia with me. Cheers, Kathmandu, for the nice discount on the pack.

It's very strange to be back. After five years of being away, little things flit back. Like bus numbers.

So tempting to nap before heading out to dinner with/at friends.

Maybe I'll succumb for a little while. It's 4:12 pm, which is more like 2 am, Brisbane time. There's only so much that shitty eight hours of broken sleep is going to do to get you aligned.

Monday, September 20, 2010

In which my old Billion 7402G becomes a Wireless Access Point

Until an hour or so ago, my home network was powered solely by a Billion 7404VGO. Luckily, the previous home owners who did the renovations also had the good sense to wire the place for Ethernet to several positions in the house.

And that's a good thing, because the main panel for that wiring is stuck in one corner of the house, the study, and that makes wireless access to the rest of the place, including the veranda, a little bit flaky.

So tonight, I hooked up my old Billion 7402G to act as a Wireless Access Point in the living room, where most things wireless are doing their accessing. This will mean a stronger signal for wireless devices on the veranda as well. This is a little note on how I did it.

Firstly, I took note of my existing subnet and gateway: 192.168.66.254, nice and memorable. I then unplugged my computer from that and hooked it into the old router, and waited until DHCP came up with something useful. It was on 192.168.1.X, so I pointed my browser at http://192.168.1.254/ and luckily I could still remember the old password for the web interface.

Now, I'm not sure that I did the next bit right, but it worked out in the end. First, I turned off the DHCP Server, set it to DHCP Relay, and pointed it at the existing router at 192.168.66.254. Next, I changed the address of the router to 192.168.66.253. At that point, I lost the connection to the router, because its IP had changed, and there was no further DHCP set up on it to provide a proper IP address to my computer. A small voice in my head mentioned that probably wasn't the smartest move, and luckily I just do programming, not any type of network support.

So I plugged my computer back into the network, and plugged the old router into the new one. I then accessed the old router via http://192.168.66.253/ and saved the config!
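The addressing side of this can be sanity-checked in a few lines of Python with the standard ipaddress module. This is just an illustration of the scheme above (the two routers on distinct static addresses in the same subnet), not anything the Billion itself runs:

```python
import ipaddress

# Addressing plan from the post: main router (gateway + DHCP) at .254,
# the old Billion repurposed as an access point at .253, same /24 subnet.
subnet = ipaddress.ip_network("192.168.66.0/24")
gateway = ipaddress.ip_address("192.168.66.254")
ap = ipaddress.ip_address("192.168.66.253")

# Both must sit inside the same subnet, or LAN clients can no longer
# reach the AP's web interface once its IP changes.
assert gateway in subnet and ap in subnet

# The AP's static address must not collide with the gateway, the network
# address, or the broadcast address.
assert ap != gateway
assert ap not in (subnet.network_address, subnet.broadcast_address)

print(f"AP at {ap}, gateway at {gateway}, both on {subnet}")
```

The same check applies if your LAN uses a different private range; the only rule is that the access point's static address lives in the gateway's subnet and stays clear of anything DHCP will hand out.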

Now that I could access the old router via the new router, via the browser, it was time to swap the wireless part over. I disabled the wireless on the new router, and enabled the wireless on the old one. I made sure to keep the SSID and password the same, under the same type of encryption. I then used my iPhone to make sure that side of things was all good.

After doing one more Save Config, I moved the old router into the living room, plugged the cable from the router into the wall socket where the TiVo used to go, and plugged the TiVo into the router. I then checked the DHCP table on the new router, but couldn't see an entry for the TiVo. As it turns out, the TiVo isn't connected to the network all the time, so I used Annika's laptop to test the router.

First, I made sure the wireless was still working. Well, that was a problem. The wireless seemed to be working, in that there was a connection and the signal was strong, but there was no internet access. I knew that I wanted some sort of bridge, but I couldn't find any solid documentation on it, so I put the router into 1483 Bridged IP LLC mode, with NAT enabled. Now, since I've got nothing actually plugged into the WAN port, it shouldn't make a lick of difference, but somehow, it did. The internet became available to the laptop, via wireless.

I then disabled the wireless card, and plugged the laptop into the router via cable, and that worked too. I should note that the now-wireless router showed no DHCP Table, nothing in the NAT Sessions and nothing in the Wireless Associations. Actually, looking at the Wireless Associations now, it has two entries, where the IP Address is unknown but the MAC is populated. These are probably the household iPhones, and I think the unknown IPs are because DHCP is handled by the newer router.

If anyone does know how to set up a Billion 7402G as a wireless access point properly, please feel free to drop me a line.

Thursday, September 16, 2010

Last night's grading

I had another Tae Kwon Do grading last night. This one was progressing from my third blue belt to my first red belt. The grading requirements are fairly simple, just form and sparring, but they must be of good quality to deserve the entry into red belt. This is the third grading I've had that has required sparring, so I didn't get nervous about it until yesterday, and I thought I was fairly well prepared.

The form went as well as it could. The school hall in which we do our gradings has a floor of some synthetic bouncy material that can be a little too slippery for my liking. I rely a little on the floor having some grip so I can make my stance changes and moves swift, and yet have power. That becomes a bit of a problem when your foot might slip out at any moment.

Then came the sparring. Sparring opponents were a little thin at the "older guy" end of our belt level. There were just four of us: one who trained at my club, another whom I'd sparred when going for my second blue, and another whom I'd sparred when going for my third blue. Both of the likely opponents I had history with.

The first I had knocked over when going for second blue, by putting up a kick that lifted his already raised leg and knocked him off balance. While we were warming up for our third blue, he looked like he had been practicing his spinning side kicks a lot.

The second I had accidentally landed a glancing blow off his upper chest into his jaw when we were sparring for our third blue. I lost a point for that infraction, since there is to be no punching in the face at gradings, but I still got a pass overall for the grading. Punching in the chest is something I've been doing in training when I get too close to land a good kick, or I get really tired. Though this time around, I wasn't going to let either of those things happen. I should mention this guy is an ex-boxer, and very very solid.

This time around, I was paired with the second guy, and we even had the same ref. I wonder if he recognised us?

I only remember giving two kicks, maybe blocking one and then smelling sweaty glove and wondering what was causing the pain at the back of my neck. And then wondering what I was doing on the floor. Oh yeah, I copped a punch in the face, and lost a little bit of memory for the second or so before it happened.

I'd like to think it was a good solid punch. It landed on my top lip, because that was all feeling pretty numb. It must have just slid under my nose, because that wasn't bleeding and there wasn't the shooting pain and watery eyes that usually come from such a thing. It was right in the centre of the lip, so it definitely wasn't glancing. And then there was the pain in the neck. I guess any punch in the face is going to see a bit of whiplash movement in the head.

Amazingly, I was fine! Well, I was at that point. That was probably the adrenaline kicking in, or just numbing everything. There was no blood, my mouth guard and the cushioning in his glove had done their job.

I got up. The ref asked me if I was okay. I was thinking, "I don't know. Shouldn't you do one of those pupil dilation tests?". Instead, I just nod my head enthusiastically. I'm not much of a talker with the mouth guard in. I was expecting a bit of "how many fingers am I holding up?". Instead, I get asked to breathe. So I do. I think there's a dopey grin on my face at the time. And then someone from the judges' panel suggests I do proper breathing, so I do the big arms up, out to the side and in breathing exercise that we do at TKD, to show I'm good in the breathing department and can still follow instruction.

So with that done, we're back in for round two. It was really short. I think I managed to get three more kicks in, two of which were definitely head height, and then it was over. I was just about to launch a spinning kick as well, as that's about all that's left to do against an opponent of my size who's an ex-boxer. I seem to recall a similar thing happening last time, when I realised that the front kicks I was landing weren't going to do much to move him back.

My trainer checked with me as I was putting my gloves and shin guards away. He said something along the lines of me catching a downward chip. Despite the lack of broken skin, and that I was otherwise feeling fine, I don't think it was a chip. And landing where it did, it sure wasn't glancing off anything else.

It wasn't until I was halfway home that the possibility of a headache presented itself. I popped a couple of paracetamol when I got home, and relayed the story to my girlfriend. She's a fourth Dan in TKD and fairly familiar with the sparring routine. Apparently, it's par for the course to make the second round really short, just to make sure the knocked-down opponent can get back up and get back into it.

Today, my neck is a little sore and my nose is a little sore to touch. I guess it's all connected, and maybe my nose got buried in the glove padding after all, just not enough to make contact with the bone.

I'll find out next Wednesday if I pass or not. Hopefully, I will.

Thursday, September 9, 2010

Christchurch Earthquake

I was born in Auckland, though most of the New Zealand childhood I remember was spent in Christchurch, until I moved to Australia when I was ten.

The recent quakes experienced in Christchurch are changing the face of a city that I've only had the pleasure of visiting a few times since I left 25 or so years ago.

For outsiders to get a good idea of where the quake occurred, and the fear that a lot of Canterbury dwellers are experiencing with these aftershocks still happening, take a look at the Christchurch Quake Map.

I hear no hobbits were harmed during the quakes, though that's mostly because their homes are located in the North Island, and they're not real.

Saturday, September 4, 2010

Like a no-ob, tweeted for the very first time...

Sometime last week, I signed up for a Twitter account. Twitter's been around for so long, but I've been slow to adopt. And truth be told, I'm likely to use it seldom. I wasn't really sure what would make me tweet, why I would tweet or even when I would tweet. But events conspired against me today, and I finally realized what it (Twitter) was for.

I've got several social networking gizmos floating around the net. I've got three blogs, each with a different purpose. I've got Facebook, which I use for friends' news. Stuff posted to FB has either come from my public blogs, or is something for friends. I've got Google Buzz, which is good for stuff shared by friends on Twitter, Flickr, blogs (mine specifically), reshared by Google Reader, or even just original content. Even though Buzz is a free app within the Google domain, usually the only people to notice Buzz are those who have opted in to it and have it as an option in their Gmail.

In summary, Facebook is for friends, Buzz is for Buzz friends, blogs are for everyone, and Twitter is for everyone in 140 characters or less at a time.

Now I must change the topic a little, and talk about today. Today started out good. I got up, had a shower, had a bit of breaky and greeted the day at about 10am or so. The greeting started with a mowing of the lawn. It's in various states of repair, and the new lawn, which had been sown by hand, was looking like it needed its first mow. There were also a lot of leaves starting to cover the bits of dirt that were struggling to spring forth with grass, so I thought a bit of mowing with the mulcher would break up those leaves and promote a bit of growth.

After the mowing, I hauled the garden bag to the front of the yard. It was getting a bit full, and is due to be picked up sometime in the next two weeks. I'd gone a bit nuts pulling weeds out of the lawn the previous week.

Then there was a mattress to cover and move into storage under the house.

And finally, a scrubbing of the front deck in preparation for an oiling. It's actually a bit more involved than a standard scrubbing. First, there's the nail punching, then the sanding, then the hot water and soap scrub, then the hosing, then the nasty chemical scrub for that deep clean and another hosing. Whilst doing this, I noticed that some of the support timbers had rotted, and would need replacing sometime in the next year. I knew it would happen sometime, since the building inspector told me as much when we bought the place, I just didn't think it would be this soon. Folks, don't use pine for support timbers on a deck that is exposed to the elements. Use hardwood, or something that won't rot.

Anyway, by the time that was all done, it was time to go to Neek's folks' place to help lift pavers in their front driveway. It was going to be a bit of a long afternoon thing, so we packed Moo Moo (or Missy Moo Moo, as she is formally known) in her travel cage, packed an old wooden pallet and wheelbarrow in the borrowed trailer, and hooked that up to the car (the trailer, that is: Moo Moo gets the front seat, with Neek).

So we're driving along, and just before we hit the Captain Cook bridge, I notice possible flapping of the rope holding the wheelbarrow down in the trailer. Oh crap. A quick exit at Stanley Street and a bit of an adjustment later, we're back on the freeway.

It was a pretty nice day out, and Riverfest was on with the jet planes planning to do their last low fly over and after burn thing that evening. The freeway had a bit of traffic but nothing too busy. Just the usual unexplained slowing of traffic up ahead.

Then just as I was approaching the exit for the Inner City Bypass at the start of Coronation Drive, the car starts to shudder, and it feels like the trailer is jumping around. Bloody hell, what now? I'm thinking I've got a flat, but I just can't stop. So I take the Little Cribb Street exit, and pull over to check out the suspected flat.

Well, no flat. All tires seemed to be quite inflated. So what the hell happened? Then I see it. A swath of metal mesh jutting out of one of the trailer wheels. Oh, bloody joy. The tire was totally cactus. Well, no problems. I'll just swap it with the spare from my car. Jack comes out, nuts are loosened, trailer lifted, wheel off, wheel on. I said, wheel on. Oh bugger, the wheel won't go on. My car has wheels with four bolts, the trailer has wheels with five. Bollocks.

So phone calls are made. I mentioned earlier that the trailer was borrowed. It was borrowed from Annika's parents. We were heading there anyway, so they get the first phone call. They've got a car with a five bolt wheel. Yay! Neek's sister is on the way with the spare wheel. It's now about 2pm. We left home around 1:30pm. It's going to be about 25min before Catherine arrives, so Neek wanders around to the local fast food joint to use the facilities, and to see if the petrol station there has any tire options, just in case.

So I finally figure out what Twitter is for. It's for saying something that everyone, or no one, will read. Since I hadn't tweeted yet, the only people to read it would be those following me on Buzz. Maybe I'll hook Facebook into Twitter some other day. It was supposed to read something like "Busted trailer tire. Saturday arvos on Little Cribb St are tops." Thanks to the auto spelling on iPhone, it came out slightly different. So there it was, my first tweet on the Twitter, complete with spelling mistakes. Awesome. There was a follow up. Moo Moo was happily preening in her cage, on the front seat with the window down, while I stood outside. I couldn't hear what she was saying, but I could tell it was happy chats.

That's about all the energy I have for completing this blog post, but I'll try and cover the rest in as few words as I can.

Moo Moo gets seed stick. Neek returns with fries and hand feeds YT, who has dirty hands. Catherine arrives.

Tire fail, it's too wide for the trailer. I call RACQ. Cathy calls a 24 hour tire place. RACQ can only do a tire service, so I tell them I'll call back if the 24 hour tire guy can't help. Neek looks up her car on the internet to check the tire specification. It's 175SR14, just like the trailer tires. Awesome.

Neek and Catherine head back to ours to get the spare. Get a phone call from Neek's Dad, with an update on the options: Neek's spare, the 24 hour tire guy who will call back soon, and a tow.

Tire fail number two. Neek's car only has four wheel bolts too. Neek and Catherine hit Google. Neek and Catherine drive to the local Kmart Tire and Auto that's open 'til 4pm. It's currently 3:44pm.

Tire fail number three. They don't sell wheel hubs, and need the rim, which is inconveniently located in the trailer at my location. Neek and Catherine drive back to me. Neek's dad is on his way to investigate the situation, and considers driving with the bung tire. I get a call from the 24 hour tire guy, who doesn't have anything new in that size on a Saturday afternoon. I'm pretty chuffed he called back, and since I missed his call, I decided to call back and thank him.

Neek and Catherine arrive. Keith (Neek's dad) arrives. Keith surveys the damage to the tire, and it doesn't take much to convince him that driving with that thing would be a danger to everyone. Military choppers fly overhead. Must be prepping for Riverfest.

Neek organises a tow with RACQ. Neek and Catherine stay with the trailer and Keith's car, to accompany the RACQ tow guy back to our place. Keith takes Catherine's car back to his. I follow in mine, take a wrong turn somewhere along the way, and finally get to help with lifting pavers until it's too dark to tell the good pavers from the bad. The RACQ tow guy turns up in record time, and Neek and Catherine finally arrive just after dark.

Shower. Dinner. Skullfck joke on Facebook via iPhone. Drive home. Put Neek to bed. Blog.

And now it's time for a bit of WoW.

Tuesday, August 17, 2010

Blizzard maintenance still blows

Tuesday night. Date night. So called, because it's the one night of the week when WoW servers go down for maintenance, and I can spend some quality time with my better half. That is, until she goes to bed, or I have one of those sleep deprivation catch-up moments where I can't keep my eyes open past 9:30pm.

Assuming I haven't just fallen asleep, I toddle off to the computer and play... something else. Because WoW is down for maintenance.

Well, I thought StarCraft II might change that a little. I thought it would give me the option of another quality Blizzard game that I could play when WoW servers are down.

But not tonight. Even though I just want to play the single player campaign, or even a couple of matches against the A.I., I can't, because SCII requires signing into Battle.net. And even though I'm located in the South East Asia (SEA) region of Battle.net, which is unable to communicate with the Americas & Oceanic region (for the cross game Real ID chatting that most Americans get to experience), the SEA Battle.net servers have also been taken down for maintenance, during peak play time for the SEA region.

Makes me wonder if there really is a physical SEA Battle.net server hosted locally to this region, or if they plonked it in L.A. or Hawaii to get it as close to the region as possible without having to invest in any infrastructure outside of the U.S.A.

Maybe I'll be looking forward to Guild Wars 2 for my WoW downtime play. A second MMO where all you pay is the price of the game, without the "what's the catch?" feel of a Free 2 Play/Try is looking to be my idea of a tasty cuppa. That, or Portal 2, whenever that gets released.

Monday, August 16, 2010

What about the people on the boats?

I've been thinking about this post for the better part of 24 hours, and have decided I can't structure it nicely to make any sort of constructive, informed, coherent argument. So I'm just going to post it like the splatter pattern it represents: the latest LNP radio jingle about "stopping the boats" is giving me the shits!

I'm not really a politically minded person. Political discussion doesn't excite me, and I'm not informed enough to give constructive comment (I may have mentioned that just sentences ago). But we've got a national election this Saturday, and if the LNP can come up with "stopping the boats" as an election standpoint, then I reckon I could garble on about all sorts of crap.

So, this radio jingle I heard on the weekend goes on about how Labor couldn't "stop the boats". And that's about it. No informative policy on what LNP are going to do. But that's not what shat me the most. It was that I just find the whole "stop the boats" argument bloody insulting and inhumane.

My initial selfish take is that I'm a boat person. Difference is, I arrived by plane, and with my parents when I was 10 years old. They came to Australia, looking for a fresh start and new opportunities. My parents weren't even trying to escape an oppressive regime: they arrived from New Zealand.

My second take is that the whole jingle sounds like one of those racist viral emails trying to disguise itself as humor or insightful commentary. It's really disappointing, but not so surprising. The Ugly Australia is still alive and kicking, and the scary thing is, there might be enough ugly Australians out there that they may even win.

I'd rather hear about a constructive policy that deals with efficient processing of immigrants, and people looking for refugee status. I'd rather hear about plans for diplomacy that address the root causes of why people are leaving their home countries, looking for a better life. I'd rather hear about the life stories of these people, willing to risk their lives to a shoddy boat and the wild seas for a chance in a place where, by all accounts, they're not welcome.

From me, the LNP will be getting preference votes somewhere down around Family First and the pro-Christian lobby groups.

Friday, August 13, 2010

Integrating CakePHP with legacy and CodeIgniter

This post will discuss how I integrated CakePHP with a legacy application that had already been integrated with CodeIgniter.

Motivation


Some time ago, I decided to make a legacy application a bit more of a pleasure to maintain by introducing MVC. At the time, CodeIgniter offered the best flexibility with regards to integration, in the realms of session management and models. I was already doing projects in CakePHP, but at the time, it didn't seem to be flexible enough. That might have been back in late 2008.

So I kicked off the initial integration, then ended up passing a lot of the work on to my minions (Hi minions!). Earlier this year, my minions ceased being my minions (bye minions, I miss you!) and I've had to do a bit of maintenance and new project work with the legacy application. Having worked so long with CakePHP, CodeIgniter was like a very unwelcome swim in the middle of winter. The thing that irked me the most was the lack of a built-in ORM. I went looking for some, but the recommended ones were quite strict, and that wouldn't do for integrating with the legacy database.

I had already done some work integrating CakePHP with CKEditor (for sessions), and CakePHP 1.3.3 did seem to be at a point where you could use its models with legacy database table naming conventions. So I bit the bullet, loaded it, pointed it firmly at the existing CodeIgniter part of the application and said "No more".

Preparation


Unlike my previous article regarding the integration of CakePHP and CKEditor, I won't be posting a lot of code in this article. If you're going to do something similar with your legacy product, you really need to understand how your legacy app works, and you need to understand how CodeIgniter works (if you're looking at that bit too).

Now, before reading much further, you might want to read about integrating CodeIgniter with a legacy application. This will give you a run down of the existing structure. Then have a read of integrating CakePHP and CKEditor, as a few of the same methods from that will be used here.

The Big Picture


So what's novel about what I'm doing? Well, I've got one existing framework that works with mod_rewrite and some legacy mush that calls the PHP directly, and now I'm about to introduce a second framework that also uses mod_rewrite, and all three will need to be able to share the same session, and not trip up on each other's virtual directories. Let us take a look at the directory structure.

To protect the guilty (that would be me), I'm going to give the legacy app an imaginative name, different, but not far removed, from the actual name of the legacy application. Let's call it LDS, for Legacy Database System. Recently, it had a rename, but hasn't actually been formally rebranded. This has helped a little with the real application. We'll go with a rebranding of NDS, for New Database System.

Here's a sample of the tree structure:

src This is the root of the application. It has an index.php, plus the ci_index.php, ci_open.php and ci_close.php that were used for CodeIgniter integration.

src/lds This is the main directory of the legacy application. When I first started, there were only a few subdirectories and a whole bunch of programs sitting in here. Yucko. There are a few more directories now, trying to group like portions of the legacy app, but there are still quite a few programs floating around in there. When you visit the website, you'll visit http://localhost/lds/. Anything with the lds in the URL is legacy code.

src/lds/codeigniter This is the CodeIgniter directory, as per the normal CodeIgniter directory structure.

src/lds/system This is the CodeIgniter system directory, as per the normal CodeIgniter directory structure. There are libraries and language directories as well, plus whatever others are required; however, I thought I'd just list this one so you get the picture.

src/lds/application This is the CodeIgniter application directory for the pure CodeIgniter code. If you were visiting the Documents controller, implemented with the CodeIgniter application, you would visit http://localhost/documents.

src/lds/legacy This is the CodeIgniter legacy application directory. It's a cut-down version of src/lds/application, used by the legacy application to establish a CI instance for session access.

src/nds This is the main directory of our new CakePHP stuff.

src/nds/cake This is the CakePHP directory.

src/nds/app This is the CakePHP app directory. When you visit the Documents controller, implemented with CakePHP, you would visit http://localhost/nds/documents.

With this arrangement, I didn't have to change the .htaccess file in src.

Having already applied changes to src/nds/app/webroot/index.php to cater for external applications, the next hurdle is logins and session sharing. Since most of the new work will be done with CakePHP, I've decided to use CakePHP's sessions, and allow the legacy app and the CI framework to access them.

CakePHP and the Legacy App


For the legacy app, following the same method as described in the CKEditor article, I modified the code that does the session check to declare an external app, include the src/nds/app/webroot/index.php, import the Session class, start a session and read the Auth.User. For new sessions, I created a new login screen via CakePHP, providing login and logout methods.
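In rough outline, the modified session check ends up looking something like the sketch below. Every path, constant and redirect target here is illustrative rather than the actual legacy code; in particular, the EXTERNAL_APP flag stands in for whatever mechanism the modified webroot/index.php uses to detect external callers.

```php
<?php
// Sketch of the legacy app's session check (names and paths assumed).

// Flag the request as coming from an external app, so the modified
// CakePHP front controller bootstraps the framework but skips dispatch.
define('EXTERNAL_APP', true);
require_once dirname(__FILE__) . '/../nds/app/webroot/index.php';

// Import CakePHP's session class (CakePHP 1.3 style) and start a session.
App::import('Core', 'Session');
$session = new CakeSession();
$session->start();

// AuthComponent stores the logged-in user under 'Auth.User'.
$user = $session->read('Auth.User');
if (empty($user)) {
    // No shared session: bounce to the new CakePHP login screen.
    header('Location: /nds/users/login');
    exit;
}
```

From here, the legacy code carries on with $user exactly as it did with its old session data.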

As a side note, the legacy app doesn't have tricky ACLs, and users' passwords are stored as plain text in the database. To this end, I overloaded the default AuthComponent with an NDSAuthComponent that assigns itself to $this->Auth during initialisation (so you can pretend you're playing with the default Auth component), and does not hash the password.
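A minimal sketch of that component might look like the following. The class body is illustrative, not the actual code; the one solid piece is that CakePHP 1.3's AuthComponent exposes hashPasswords() as the hook for custom password handling.

```php
<?php
// app/controllers/components/nds_auth.php (sketch; file name assumed)
App::import('Component', 'Auth');

class NDSAuthComponent extends AuthComponent {

    // Alias ourselves to $this->Auth so controllers can keep pretending
    // they're talking to the stock Auth component.
    function initialize(&$controller, $settings = array()) {
        $controller->Auth = $this;
        parent::initialize($controller, $settings);
    }

    // The stock component hashes the submitted password before the
    // database lookup; returning the data untouched lets the plain
    // text passwords in the legacy database compare correctly.
    function hashPasswords($data) {
        return $data;
    }
}
```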

In my travels, I also came across something else that was interesting. I needed to call session_write_close() during the destruction of the CakeSession object. I thought I'd try and "do it right" by extending CakeSession, overriding the __destruct() method, and creating a new Session.save type called nds_session. However, database connections seem to be semi-hardcoded for cake_session types only, so I also had to override the __construct() method and copy the same processing for nds_session as happens for cake_session, with regards to connecting the default Session class to the datasource. In turn, I then had to replace the SessionComponent with a version that extended NDSSession instead of CakeSession.
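The shape of that override, heavily abbreviated, was roughly as follows. The constructor body is schematic (the datasource wiring it copies depends on your exact CakePHP version), so treat this as a sketch of the structure, not working code.

```php
<?php
// NDSSession (sketch). The real constructor copies the cake_session
// branch of CakeSession::__construct(), substituting the nds_session
// type, so the session model still attaches to the right datasource.
App::import('Core', 'Session');

class NDSSession extends CakeSession {

    function __construct($base = null, $start = true) {
        // ... repeat the datasource/model setup CakeSession performs
        // for Session.save = 'cake_session', but for 'nds_session' ...
        parent::__construct($base, $start);
    }

    function __destruct() {
        // Flush the session to storage and release the lock as soon as
        // the object is torn down, so the legacy app and CI see fresh data.
        session_write_close();
    }
}
```

The replacement SessionComponent then builds on this class instead of CakeSession.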

That almost covers the easy stuff for CakePHP and the legacy application. There's one last change regarding accessing legacy code. There are some nice library calls in there that I'd like to access from CakePHP, instead of having to copy the code into a Component or Helper or Library and maintain it twice. So I needed a way to include the legacy app on the CakePHP include path.

CakePHP and CodeIgniter also do not play well together. I could not include both frameworks within the legacy code, so I had to remove all references to CI in the legacy code and replace them with calls to CakePHP. This was mostly for session calls anyway, so it's no biggie. However, I did have one component, written in CI, that handled a recently accessed client list. I haven't dealt with it yet, but will probably have to write an equivalent helper/component in CakePHP.

To do this, I altered src/nds/app/webroot/index.php to define a LDS_ROOT and include it in the include_path setup. I also added some define statements to prevent legacy code from reconnecting to the database if called from CakePHP, since this reconnection tends to ruin the database connection for the remainder of the CakePHP processing.
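Those additions to src/nds/app/webroot/index.php amount to a couple of defines and an include_path tweak. In sketch form, with LDS_ROOT kept from the description above but LDS_SKIP_DB_CONNECT being a name invented for this example:

```php
<?php
// Additions near the top of src/nds/app/webroot/index.php (sketch).

// Absolute path to the legacy application: webroot -> app -> nds -> src,
// then down into lds.
if (!defined('LDS_ROOT')) {
    define('LDS_ROOT',
        dirname(dirname(dirname(dirname(__FILE__)))) . DIRECTORY_SEPARATOR . 'lds');
}

// Put the legacy libraries on the include path so CakePHP code can
// include them directly instead of maintaining copies.
ini_set('include_path', LDS_ROOT . PATH_SEPARATOR . ini_get('include_path'));

// Checked by the legacy bootstrap: when defined, skip the legacy
// database connect, which would otherwise ruin the connection
// CakePHP is already holding.
if (!defined('LDS_SKIP_DB_CONNECT')) {
    define('LDS_SKIP_DB_CONNECT', true);
}
```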

CakePHP and CodeIgniter


The next part was to integrate CakePHP and CodeIgniter. Unfortunately, these two frameworks do not play well together, so the only real integration I could achieve was to provide a Session class to CodeIgniter that could read CakePHP sessions.

This involves overwriting the CI_Session class with a version that conforms to the method signatures of the original CI_Session class, but also handles session management the same way that CakeSession does. That includes timeouts, the Security.salt config, User Agent checking, session regeneration, and Flash messaging. This is the code I'm not going to give over.

It's a horrible hack... a mish-mash of the CakeSession and CI_Session, using CI for database access, and including the CakePHP String and Set classes for reuse. So while you haven't got the code, you've got the basics of how to create your own CI_Session that uses CakeSessions.
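To give those basics a little more shape, here's a skeleton of such a replacement class. It's only an outline: the method names match CodeIgniter 1.x's native Session library (which you replace by dropping a file of the same name into application/libraries), and the CakeSession-compatible internals are reduced to comments.

```php
<?php
// application/libraries/Session.php (skeleton)
// Replacing a native CI library means keeping the original class name.

class CI_Session {

    var $CI;
    var $userdata = array();

    function CI_Session() {
        $this->CI =& get_instance();
        // Read the CakePHP session cookie, load the row from the
        // sessions table via $this->CI->db, then validate it the way
        // CakeSession does: expiry, Security.salt, user agent check,
        // and session id regeneration where required.
    }

    // Same signatures as the stock CI_Session, backed by CakePHP data.
    function userdata($item) {
        return isset($this->userdata[$item]) ? $this->userdata[$item] : false;
    }

    function set_userdata($newdata = array(), $newval = '') {
        // Merge into $this->userdata and write back in CakePHP's format.
    }

    function flashdata($key) {
        // Read CakePHP-style flash data so flash messages survive
        // a hop between the two frameworks.
    }
}
```

None of this is drop-in; it only works once the storage format on both sides genuinely matches.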

The End!


Now, I'm still in the middle of development, and not everything has been tested, especially passing Flash messages from CakePHP to CI and vice versa. Hopefully, it's not something I'll have to deal with, but it's on the list to take a look at.

I hope this has given you a little insight into how to integrate CakePHP with your legacy application.

Wednesday, July 28, 2010

Starcraft II : First Impressions

Every man and his dog has probably done a StarCraft II : Wings of Liberty review when they got their hands on a beta key. But given that I didn't have my beta indicators on at the appropriate time, and am particularly craptacular at Real Time Strategy games, I'm posting my first impressions review on the second day of release.

So far, I've only taken part in the campaign. I'm a big scaredy cat, and don't like pitting myself against real people unless I know the tools at hand. And given that I haven't played SC1 since at least 1997 (and I might have even stopped playing in 1996), I'll stick to the campaign until I've been through a fair whack of the single player mode.

So, after the initial installation last night, and adding the game to my Battle.net account, I quickly went through the tutorials. They're great. They give a nice quick rundown on how to move, attack, build, manage supply and the like. They're also a good introduction to some of the keyboard layout.

Keyboard layout. So far, it's okay, but I think I may end up remapping keys. I'm very used to playing keyboard and mouse, and usually run with an ASDW setup for movement and strafing with the occasional SDFE, depending on the number of abilities a particular character may have. SCII has a lot of these keys mapped to unit abilities on the character card (I think they called it that, or was it game card; the bottom right panel, anyway), and you either use the mouse, arrows or the minimap to move your view port. Swapping my left hand from arrows to keys (which are mostly on the left side of the keyboard, especially A for Attack) can be a little frustrating. Maybe A will get a special mapping on my mouse.

After completing the tutorials, it was in with the campaign. I played the first one three times, to cover off normal, achievements and hard mode achievements. As a testament to my crap playing, only Raynor and a bunch of the civvies survived the Hard mode, so I didn't bother with Brutal mode.

The between game interface is really well done. I had a bit of fun playing the arcade mini-game in the cantina and noticed the dance of the hologram was based on the Night Elf dance model.

Iffy things were the achievements server and RealID.

Part way through the infestation quest, I got a notification saying that achievements would not be available until further notice. At this point, I'm not playing the game for achievements; that's something I'll go back and do later, in a few weeks or months, when the achievements server should be more stable. However, it would be quite annoying if I were that type of player, and it kept going offline.

Also, since I'm located in Australia and have a SEA version of the game, I had to add my RealID friends again, and I don't get cross game communications with friends playing WoW. Blizzard have said that SEA players will have the option of installing the North American version of the game so we can play against friends in the US, or just have cross communications with people playing WoW (on US servers). I think Blizzard have missed the mark with RealID for people in the Oceanic region. It becomes apparent that their authentication servers and chat servers are tied together, even though there might be some replication between authentication servers in North America and SEA. I guess if I didn't play WoW at all, and didn't know anyone playing WoW, it wouldn't matter. It just seems like a bit of shiny has been tarnished. It's a small thing and has no lasting impact on the actual game. If anything, it's commentary on Blizzard's effort to break into the social networking scene. And for the record, I'll not be linking SCII with Facebook, ever.

Anyway, I want to end this first impression on a happy note, since I'm very happy with the game. Having upgrade choices, mercenary hires and research trees that don't have to be selected in the heat of battle is a big plus. Having interactive features between games to progress the storyline is also good.

Somehow, this review seems incomplete, but I've got other stuff to do (like work). Maybe I'll post some more thoughts when I've taken part in the multi player side of things.

Tuesday, July 27, 2010

Starcraft II in my hot little hands.

In Australia, Starcraft II : Wings of Liberty is rated M.

That's M for mature. And that's me, right?

Right?

Monday, July 26, 2010

Regretting CodeIgniter

Once upon a time there was a legacy PHP application. It was written with no particular framework in mind, and no particular structure to where code was situated. For a while, it even used ADO DB, ODBC and MySQL to access the same database. Yeck.

Then one day, a developer came along. He tried to make improvements to the application, to make it easier to maintain and to make it easier to add new functionality to the product. He normalized database access to just use MySQL, and tried to at least arrange business functionality into libraries, with similar functions grouped together.

This was fine for a while, but things were still rather haphazard, and the developer knew there was plenty of room for improvement.

He decided to impose an MVC framework on the legacy application, to help with new developments. But which MVC framework to use, he was not sure. He had to make sure he could do session handling and authentication from both the legacy code and the new MVC code. He also had to make sure that the framework's models were flexible enough to handle the legacy database, without having to visit every program in the system.

The developer was new to MVC in PHP. He'd worked with MVC in Java, and had a pretty good idea how it was supposed to work, but applying the same principles to PHP was going to be interesting.

He tried Zend for a start, and managed to get authentication and shared sessions working, but Zend was just messy to read. The elongated class names hurt his eyes, and threatened to turn one line of code into several, just because of where the class lived in the code structure.

While he tried to decide, he worked on other PHP projects. These projects were fresh-ish. They had the opportunity to start again, but were often based on previous projects, to try and get some code reused.

One of the projects used SMARTY templates. SMARTY was but one part of the MVC equation, but the way the developer ended up using it was very nasty. SMARTY also introduced its own mini language, which was supposed to provide a degree of independence from PHP. But nonetheless, it was another language, even if it wasn't PHP, and would still present a problem for non-programmers, so why bother? Never again would the developer touch a template system not already included in an MVC framework, and never again would he touch one that tried to introduce a separate presentation language when the native one was perfectly serviceable.

For another project, he tried CakePHP. CakePHP was very young at the time, but seemed to have some reasonable documentation, a community that was excited about what it offered, and a development team that was excited about delivering that offering. CakePHP was yummy and great for fresh projects. Favouring convention over configuration, with tools to automagically generate code for models, views and controllers, putting together the basics for even large projects was fairly easy. CakePHP even allowed some configuration of models to allow for legacy databases. The only downside was its session and authentication management. Try as he might, the developer could not find a way to simply include hooks for his legacy application into the CakePHP framework to share session and authentication data.

Finally, the developer found CodeIgniter (CI). CI was also MVC-based, and also seemed to have a community and development team as excited about their framework as the CakePHP folk were about theirs. Plus, CI's models were super flexible and used Active Record for database access, which would let the developer handle all the weird intricacies of the legacy database. Confident in his choice, the developer integrated CI with his legacy application, giving new code access to the session and authentication functions of the legacy system while allowing new parts of the system to be developed with MVC in mind.

The developer passed the new version of the product to his team members, and off they went. The legacy application would live a few more years and have new life breathed into it.

The developer went back to his CakePHP projects, and loved them so. He spent a lot of time, a lot of good times, writing new functionality: quickly, easily, cleanly.

Then one day he had reason to revisit the legacy application and CI, and it was not as he would have liked. He discovered that helper libraries weren't actually classes; they were just collections of regular functions and could not be overridden the way a normal class could. He discovered that the flexibility of the model was like being offered flour, water and yeast and being told it was bread. There was still a lot of repetitious work to be done, and it made him yearn for CakePHP.

Months later, the developer sits alone. His team has gone and he must live with his choice. He once went looking for an ORM tool to make modeling, and especially modeling relationships, easier. He thought he had found a saviour in DataMapper OverZealous Edition, but it was not flexible enough to handle the legacy database. He has not looked at Doctrine yet, and he's not sure he wants to. Anything less than CakePHP is just flour, water, yeast and sugar that tastes like the salt in his tears.

The developer has considered Lithium. Not in the medicinal sense, silly, but another MVC framework, born of CakePHP but with a different philosophy: being lightweight and taking advantage of advances in the PHP language. Unfortunately, because Lithium only serves PHP 5.3, the developer cannot use it. He must support PHP 5.2 for the sake of his legacy application.

The developer has even found a way to integrate newer versions of CakePHP with legacy code, so the session and authentication information can be shared. He's just not sure he can safely integrate his legacy PHP application, CI and CakePHP all in one product. But the more he thinks about it, the more he thinks he must try.

The developer's sadness flashes red with anger. He regrets choosing CodeIgniter. He'll ignite the code, all right. Ignite the code and make cake from its ashes.

Tuesday, July 20, 2010

Hello, Mister Seven. Part 2

Well, Windows 7 64-bit is finally installed, but not without some frustration. I should point out from the get-go that this is a Windows 7 Home Premium 64-bit upgrade from a Windows Vista Ultimate 32-bit OEM install.


The story unfolds in point form:
  • Get home from tae kwon do.
  • Install 8GB of new RAM.
  • Install new 1TB SATA HDD, making sure it's level in the case.
  • Scramble in my box of old computer shit for a SATA data cable.
  • Unplug the old SATA disk, and plug the new one in as primary.
  • Start the machine.
  • Put the Win7 64bit disk in the DVD tray.
  • Kick off the Windows 7 install, via a Custom Install (so far, so good).
  • Enter a username, enter a password, enter a Product Key.
  • Invalid Product Key.
WTF?  Why the face?  Why the FACE!?  Because WHAT THE FUCK!?  I got very shirty very quickly.  I tried my local IT expert (my girlfriend).  I tried her local IT expert (an ex-co-worker).  I tried the internet.  There's a lot of shitty information out there about upgrading from Vista 32-bit to Win7 64-bit.

Here's what I ended up doing, but you can probably cut straight to the bit that actually works.  I plugged the original HDD in as the primary, and the new one as the secondary.  I downloaded and ran the MS Upgrade Advisor, which told me that my install was good to upgrade to 64-bit, though it would require a custom install, as I had already done.  I then googled "win7 product key invalid" and found a tidbit that mentioned the upgrade product key would be flagged as invalid if the installer could not find an older version of Windows.  It also said that if you need to reinstall Win7 at any time, you have to reinstall the original product you upgraded from and then do the Win7 upgrade again.  God help you if you're not upgrading to a new HDD.

Anyway, I shut down and swapped the SATA connections so my new HDD was the primary and my old one was the secondary.  I figured that perhaps the Windows Anytime Upgrade software on the install disc might scan the other HDD and recognise the old Vista installation.  Approximately 20-30 minutes later, I was back to putting my product key in, and away it went.  Yay!

The one thing I was pleasantly surprised by was that only a couple of Windows updates were required (well, 5 downloads, to be precise), and it looks like it also installed the NVIDIA control panel for me.  I downloaded the latest version, just to be sure.

Tomorrow night, I will tackle installing things like WoW, the Curse Client, Steam and a bunch of other convenience stuff.

Monday, July 19, 2010

Bye-bye, Mister Vista. Hello, Mister Seven.

  • 8GB of DDR2 memory. Check.
  • 1TB HDD. Check.
  • Win7 Home Premium upgrade. Check.

Tonight, the old HDD gets unplugged, the memory gets replaced, and my old Vista machine starts along the road to becoming a Windows 7 machine, 64-bit and with 8GB of memory to have its way with.

The reinstall is going to take a while, I know.  And eventually downloading a few of my old games (looking at you, Steam) is going to take a while, too.

On the other hand, I'm less likely to have out-of-memory errors in WoW as I play and do a few other things on the side.  And it's a way to extend the lifetime of the rest of the machine without buying a whole new one.

I've also got a Sharkoon QuickPort eSATA combo on order as well.  This will come in handy for the few IDE drives that I may want to quickly check without getting a dedicated external case for them.

Wednesday, July 14, 2010

A Git for my sanity

A few months ago, I converted all my SVN projects at work to Git. This was mostly due to the messiness that is SVN properties when doing merges. I was hoping Git was going to cut down on that, and it has somewhat, but I feel I'm missing something.

Here's a short description of one of our products. The state of play when we moved from SVN to Git was a bit shaky as far as branches are concerned, so only after the next release will Git be working like I want it to, and how it should.

For now, I've got three "branches" to deal with.

The first branch is my "prod" branch. It's actually called "prod-20100612_email_paed_recalls", and it's a bugfix branch for what is currently live on the one site that I have to maintain it on (lucky me, it's a custom application, so I control releases to site, etc). In a perfect Git world, this should actually be the "master", but for now it's not.

The next branch is the "master" branch. At the moment, it contains the next batch of work that will be delivered in the next release. I deliver that to an external environment called Staging, where the clients get to test before it all goes live. When it does go live, I can do away with the prod branch and use the master branch how it was meant to be used. One thing I'm looking forward to on this branch is code delivery via Capistrano.

And my final branch is the "dev" branch. It's actually called "dev-20100712-mo" and contains the next batch of features currently in development for after the current "master" branch goes live. Fortunately, this branch was cut from the master, so at least there's properly maintained history.

Now, here's the workflow for when I have to update any of these branches. If you've got any suggestions as to how to do this better, I'm all ears. I'll probably even post something to Stack Overflow, asking for help.

Oh yes, I should mention: because I have to make regular updates in each of these branches, I've decided to have a clone of each one. I also have a remote called origin, where they all get pushed to and backed up.

Workflow One: The Bugfix for Production.

So, I end up making a change to the "prod" branch because of some production problem (I suffer a lot of "this was here, now it's not" issues, because forward migration in SVN was a PITA).

Once I've committed the change in prod, I go to my master directory, and do the following:

$ git checkout prod
$ git pull
$ git checkout master
$ git cherry-pick XXXXXXXX
$ git push

Because prod and master don't have the nice branch relationship that master and dev do, I have to cherry-pick my changes across. I'm okay with this. I haven't tried it yet, but I'm hoping to streamline this down to (from master):

$ git pull
$ git cherry-pick XXXXXXXX
$ git push

The next step is making sure this production fix makes it into the latest development release, so over I go to my dev directory, and I do:

$ git checkout master
$ git pull
$ git checkout dev
$ git merge master
$ git mergetool && git commit # usually
$ git push

The problem with this is that I always end up with normal merge conflicts, for which I have to run the mergetool. SVN was able to deal with these without prompting, so this is a step where I feel I'm missing something.

Possibly a better way to deal with it is (from the dev branch):

$ git pull
$ git merge origin/master
$ git mergetool && git commit # if needed at all
$ git push

The only downside is that the master in the dev directory gets out of date. Theoretically it shouldn't be a problem; it should just be a matter of checking out the master branch and doing a pull. It hasn't really worked out that well so far.
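A possible refinement I haven't tried in this setup: git can fast-forward a local branch you don't have checked out, via a fetch refspec, which would keep that local master fresh without ever switching to it. A throwaway demo (the repo names here are made up for the sketch):

```shell
# Sketch: fast-forward local "master" without checking it out, using the
# refspec form "git fetch origin master:master". Demonstrated in a pair of
# scratch repos so it can run anywhere.
set -e
tmp=$(mktemp -d) && cd "$tmp"

git init -q upstream
git -C upstream symbolic-ref HEAD refs/heads/master   # ensure branch is "master"
git -C upstream -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "first"

git clone -q upstream work
git -C work checkout -q -b dev          # dev is checked out; master is not

# upstream moves on...
git -C upstream -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "second"

# ...and we bring the local master up to date while staying on dev
git -C work fetch -q origin master:master
git -C work log -1 --format=%s master   # prints "second"
```

Note this only works because master isn't the currently checked-out branch (git refuses to move the branch under your feet), and it only fast-forwards, which is exactly what you want for a branch you never commit to directly in that clone.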

Workflow Two: The Bugfix for Staging.

This is pretty much the same as the first workflow, except I only have to deal with the master and dev directories, and there's no cherry-picking involved.

Workflow Three: Development.

Well, this is essentially the last step of the previous two workflows. I do some development, commit the work, push it to origin so it gets backed up (and one day, another developer may get to play with it), and then I want to pick up any changes that have happened on master so they don't get lost ("they" being bugfixes to either production or staging).

I just hate seeing every second line in my dev branch gitk view reading "Merge branch 'master' into dev-20100712-mo" (or "Merge remote branch 'origin/master' into dev-20100712-mo", for the one occasion I've tried so far).

I'd rather update sooner and often, and I feel that's the right thing to be doing, but it just "looks" messy. Still, I'm not convinced that leaving the merge of master back into dev (to pick up bug fixes) until the dev feature branch is near completion is the right thing to do either.

Any helpful hints from a seasoned Git user?

(Now I've just got to whittle that down to something consumable for Stack Overflow.)

Friday, July 9, 2010

Fun and Games with onsubmit()

I've been doing programming for the browser interface for quite a few years now (maybe as far back as 1998, using Progress WebSpeed 2.X), and still, I'm learning things all the time.

Today's lesson was the form event, onsubmit().

With some actions in a web page, you want to make sure that some things are really meant to happen, for example, if you are deleting an entity from a back end database. The first defense (after authentication and access control lists) against accidental actions is making sure the delete gets issued as a POST, rather than a GET. And if you're paranoid enough, the next step is getting a confirmation from the user, since some users are still prone to jittery fingers and random clickiness.

So, I have this page where timesheets can be submitted for approval, deleted, and a bunch of other actions. Actions like approval, rejection, submission and deletion all have a confirmation, using the javascript confirm() function.

Side note: Is that the right spelling? Delete is to deletion, as submit is to submi.....?

And all the forms asking for these confirmations had them set up on the onsubmit event handler of the forms in question.

Now, my menu structure is a bit special, so for the most part, there is not an actual form to fill out. When the user selects the Submit action from the menu, I programmatically trigger the submit event on the form, via submit(), and would expect the onsubmit handler to be called, run the confirmation and either submit or not submit based on the user selection of Yes or No, Okay or Cancel, or whatever confirm() does in your flavour of browser.

It turns out onsubmit() is not supposed to be triggered programmatically. It's only supposed to fire when there is a submit button in the form and a human clicks that submit button. In my experience with Firefox 3.6.6 and Chrome 5.0.375.99, the onsubmit() function gets run, but the return value of the event handler is totally ignored.

This pretty much makes onsubmit quite useless as a catch-all for validating a form, regardless of how the form is submitted.

In the end, I created a regular JavaScript function for each confirmation type (e.g. confirmSubmit(), confirmApprove(), etc.) and placed onclick events on each menu link to call that function (e.g. onclick="confirmSubmit(); return false;").

In the function, I can call confirm() and then call form.submit() based on the results.
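The shape of it, with the browser bits stubbed out (confirmThenSubmit, askFn and submitFn are illustrative names, not anything from my actual page):

```javascript
// Sketch of the pattern: the confirmation lives in a plain function, and we
// call the form's submit() ourselves based on the answer. askFn stands in
// for window.confirm and submitFn for form.submit, so the logic can be
// exercised outside a browser.
function confirmThenSubmit(message, askFn, submitFn) {
  if (askFn(message)) {
    submitFn();     // user said yes: fire off the form
    return true;
  }
  return false;     // user said no: nothing happens
}

// Wired up in the page, roughly:
//   onclick="confirmThenSubmit('Submit this timesheet for approval?',
//            window.confirm,
//            function () { document.getElementById('submitForm').submit(); });
//            return false;"
```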

I've compromised my programming principles on this one, since there doesn't seem to be any other option that is as simple to maintain. The only saving grace is that these forms do not have a visible aspect on the screen. The other forms on the page that actually take user input make extensive use of jQuery, dialog and validate.

Tuesday, July 6, 2010

Using Prolog for Cataclysmic Purposes

I wasn't too sure which blog to put this under, but I think I'll make this the main entry, and reference from my other one.

There's a World of Warcraft expansion coming up later this year, and my host of WoW toons will be making the journey from level 80 to 85, plus one extra Worgen Druid experiencing Azeroth all over again, from level 1. I've got a couple of other high level characters on other realms, but unless Blizzard do something about the 10 character limit per realm, I don't imagine they'll be doing much.

Anyway, one of the decisions facing altoholics in the expansion is which character to level to 85 first, which second, and so on. There are many factors to consider, such as profession synergies, money-making abilities, survivability and the fun factor. I think the fun factor will win out for me, but it's very handy to have a host of toons with which you can craft weapons, armor, gems and elixirs.

Anyway, my inner nerd remembered a tool that I learned back in high school that would be just perfect for a job such as this: Prolog. So I toddled off to download Visual Prolog, but since this is just an idle experiment in decision making, I was happy enough to use PIE (Prolog Inference Engine), available within the free examples. PIE most closely resembles what I worked with in high school, without all the Windowsy stuff that seems to take more focus in the tutorials than it's worth (or maybe it is worth it, but I just wanted to dive into the stuff I remembered, instead of putting a window and menu together).

Anyway, a few short Prolog statements later, I've come up with some basic facts to help determine what I might want to level first.

This set of facts should be common to all WoW characters, but I'm not sure it's exhaustive. It certainly doesn't take advantage of tailors being able to gather extra cloth, or enchanters being able to provide enchants.


armorClass("Deathknight","plate").
armorClass("Warrior","plate").
armorClass("Paladin","plate").
armorClass("Hunter","mail").
armorClass("Shaman","mail").
armorClass("Rogue","leather").
armorClass("Druid","leather").
armorClass("Priest","cloth").
armorClass("Warlock","cloth").
armorClass("Mage","cloth").

crafts("Blacksmith","plate").
crafts("Blacksmith","mail").
crafts("Leatherwork","leather").
crafts("Leatherwork","mail").
crafts("Tailoring","cloth").
crafts("Jewelcrafter","gems").

requires("Blacksmith","ore").
requires("Leatherwork","leather").
requires("Tailoring","cloth").
requires("Inscription","herbs").
requires("Jewelcrafter","ore").
requires("Engineering","ore").
requires("Alchemy","herbs").

gathers("Miner","ore").
gathers("Skinner","leather").
gathers("Herbalist","herbs").


The next facts give the state of play for my toons on Aman'Thul.


toon("Blackthorn","Deathknight").
toon("Pathak","Warrior").
toon("Colerejuste","Paladin").
toon("Nevynoch","Hunter").
toon("Fidgette","Rogue").
toon("Anion","Druid").
toon("Benzol","Priest").
toon("Bojsen","Warlock").

hasProfession("Blackthorn","Herbalist").
hasProfession("Blackthorn","Alchemy").
hasProfession("Pathak","Miner").
hasProfession("Pathak","Blacksmith").
hasProfession("Colerejuste","Jewelcrafter").
hasProfession("Colerejuste","Miner").
hasProfession("Nevynoch","Herbalist").
hasProfession("Nevynoch","Inscription").
hasProfession("Fidgette","Engineering").
hasProfession("Fidgette","Miner").
hasProfession("Anion","Skinner").
hasProfession("Anion","Leatherwork").
hasProfession("Benzol","Tailoring").
hasProfession("Benzol","Alchemy").
hasProfession("Bojsen","Tailoring").
hasProfession("Bojsen","Enchanter").


And these handy statements let me discover stuff I could have worked out on paper, but I'd rather marvel at the revival of my high-school-level Prolog skills.


wears(Character, ArmorClass) :-
    toon(Character, Class),
    armorClass(Class, ArmorClass).

gathersFor(Gatherer, Crafter) :-
    hasProfession(Gatherer, GatherProfession),
    gathers(GatherProfession, Material),
    requires(CraftProfession, Material),
    hasProfession(Crafter, CraftProfession).

craftsFor(Crafter, Character) :-
    hasProfession(Crafter, CraftProfession),
    crafts(CraftProfession, ArmorClass),
    wears(Character, ArmorClass).


With these basic statements, I can see who my most useful gatherers are for leveling professions (gathersFor(X,Y).), and who can support whom for crafting gear (craftsFor(X,Y).).
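For example, hand-tracing the facts above (this is my reading of them, not pasted PIE output), I'd expect queries along these lines:

```prolog
?- gathersFor("Pathak", Crafter).
% Pathak's mining feeds ore to Blacksmithing (Pathak himself),
% Jewelcrafting (Colerejuste) and Engineering (Fidgette).

?- craftsFor("Anion", Wearer).
% Anion's Leatherworking covers the leather wearers (Fidgette, Anion)
% and the mail wearer (Nevynoch).
```

So a gatherer that appears against many crafters is a good early-leveling candidate, at least on paper.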

Sunday, July 4, 2010

Integrating CKEditor and CakePHP : Part 6

In Part 5 of the series, I covered how to check the CakePHP session from CKFinder to see if there was an authenticated user. There were a couple of shortcuts taken, so in this article we look at using the Auth component to check for authentication. As yet, this still does not cover ACLs in CKFinder.

Now, the previous method had CKFinder take advantage of inside information: that the authenticated user was stored in the session, and that the authentication model was the User model. We were also assuming that just because you had a session, you were allowed to upload files. While this might be true for some controllers, it might not be true for all.

While I was trying to work out how CakePHP authentication could be integrated into CKFinder, I got the chance to take another, closer look at how CakePHP authentication works.

In my example code, there's very little setup for authentication, which means that as long as I have an authenticated session, I am considered authenticated for all controllers. However, this might change if I use the controller method of authorization, where each controller overrides the isAuthorized() function to determine if a user is allowed access to any actions on that controller. Or I might use the actions method, where ACLs are used to determine if a user is allowed to access a particular action (this has been my preferred method of authentication and access control to date). There are other methods, such as crud (ACLs on actions mapped to create, read, update and delete operations), object (an isAuthorized() function on any object) and model (like object, but just for models).

Suggested reading for a brief overview is the CakePHP Book section on Authentication; for a more detailed view, go straight to the AuthComponent, but look at the source, as the summary of methods might be a bit too vague.

Anyway, to integrate with CakePHP authentication, we have to make use of the Auth component. Components usually expect a controller, and in this case a controller and an action are needed. For our simple authentication, we're going to say "if you're allowed to access the contents/edit page, then you're allowed to use CKFinder". To do this, we need to communicate to CKFinder the controller and action we'll be working with, which is done via the ckeditor element. Update it with these new session variables after setting up the other session variables.

$_SESSION['controller_name'] = $this->name;
$_SESSION['controller_action'] = $this->action;


And then we update the vendors/ckfinder/config.inc.php to pick out the controller and action, start up the controller and use the Auth component to determine if the user in the session is authenticated to access the controller and the action.

<?php 
/**
 * This file is included by the CKFinder config.php, and will
 * set up the basePath and permissions, as is specific for the project
 */
 
define('EXTERNAL_APP', true);
// starts from app/webroot/ckfinder/core/connector/php/connector.php
include_once '../../../../../index.php'; // targeting app/webroot/index.php
 
App::import('Core', 'CakeSession');
 
$Session = new CakeSession();
$Session->start();
 
// What resource type are we playing with, Image or File
if (isset($_GET['type']) && $_GET['type'] == 'Images') {
 $baseUrl = $Session->read('path_to_dest_image');
 $baseDir = $Session->read('path_to_destsvr_image'); 
} else  /* if ($_GET['type'] == 'File') */ {
 // File is the default
 $baseUrl = $Session->read('path_to_dest_file');
 $baseDir = $Session->read('path_to_destsvr_file');
} 
 
function CheckAuthentication() {
 $Session = new CakeSession();
 $Session->start();
   
 $controllerName = $Session->read('controller_name');
 $controllerClass = Inflector::camelize($controllerName).'Controller';
 App::import('Controller',Inflector::camelize($controllerName));
 
 $controllerAction = $Session->read('controller_action');
  
 $params = array('action' => $controllerAction);
 $controller = new $controllerClass();
 
 $controller->params = $params;
 $controller->action = $params['action'];
  
 $controller->constructClasses();
 $controller->startupProcess(); 
  
 return !is_null($controller->Auth->user());
} 


You might notice a couple of improvements to the code. Because we're properly bootstrapping CakePHP, we now have access to the App::import() function to include CakePHP classes. There's also a chance that CKFinder will not pass the "type" on the URL, so we make sure it is set before doing anything.

Now, at this point, I wonder if there's any need to try to make use of CKFinder ACLs. Since we have CakePHP authentication integrated, we also have CakePHP ACLs integrated by proxy. There is one case where you might want to make use of CKFinder ACLs: where you want to differentiate CRUD actions on files from CRUD actions on folders. This might be harder to set up and feed from CakePHP in a non-gimmicky way, since CakePHP ACLs deal with one access point per check, whereas CKFinder ACLs deal with two, files and folders.

At this point, I've just got vague suggestions for that integration. If you have set up ACLs in CakePHP and are using groups or roles, then set the role of the user in the ckeditor element, extract it in vendors/ckfinder/config.inc.php, and populate $config['RoleSessionVar']. Though I'm not too sure how accurate it would be to call it a role, since a user may have multiple roles, and from a look at ckfinder/config.php, you only get to choose one.
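As a very rough sketch of that suggestion (CKFinder_UserRole is just a name I've picked for illustration; only $config['RoleSessionVar'] itself comes from ckfinder/config.php):

```php
<?php
// In the ckeditor element, alongside the other session variables, stash one
// group/role name for the authenticated user (sketch only -- the Auth data
// path depends on how your users/groups are modelled):
$_SESSION['CKFinder_UserRole'] = $this->Session->read('Auth.User.group_name');

// And in ckfinder/config.php, tell CKFinder which session variable
// holds the role, so the ACL entries there can match on it:
$config['RoleSessionVar'] = 'CKFinder_UserRole';
```

This still leaves the multiple-roles problem unsolved: you'd have to pick one role per user before writing it into the session.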

For the moment, that concludes my series on integrating CakePHP and CKFinder via CKEditor. I hope you found it educational. It was certainly an interesting journey for me, especially being able to bootstrap CakePHP into legacy code.

Series Index : Part 1, Part 2, Part 3, Part 4, Part 5, Part 6

Saturday, July 3, 2010

Integrating CKEditor and CakePHP : Part 5

In Part 4 of the series, I covered how to integrate CakePHP database sessions into CKFinder. In this article, I will cover how to integrate CakePHP authentication.

This is only going to cover basic authentication for the moment. No tricky ACLs, and no roles.

First, we'll need to set up our application to have someone log in. I created a users table with id, username and password. Let the baking ensue!

Because I'm aiming for the simplest setup and the least amount of code, I followed the example from the CakePHP Authentication documentation. Namely: use the Auth and Session components in the AppController, provide login and logout actions on the Users controller, and a login view. Once you've set this up, you may wish to temporarily set the allowedActions on the Users controller to '*', so you can create at least one user entry. Then return the allowedActions to array('login','logout').
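For reference, the pieces described above amount to roughly this (a sketch of the CakePHP 1.3 conventions, not a copy of my actual controllers):

```php
<?php
// app/app_controller.php -- every controller gets Auth and Session
class AppController extends Controller {
    var $components = array('Auth', 'Session');
}

// app/controllers/users_controller.php
class UsersController extends AppController {
    var $name = 'Users';

    function beforeFilter() {
        parent::beforeFilter();
        // login/logout must be reachable without an authenticated session
        $this->Auth->allowedActions = array('login', 'logout');
    }

    function login() {
        // Intentionally empty: AuthComponent checks the posted
        // username/password against the User model for us.
    }

    function logout() {
        $this->redirect($this->Auth->logout());
    }
}
```

Plus an app/views/users/login.ctp with username and password fields, which the bake shell can generate.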

Now we can tie in CakePHP Authentication with our CKFinder configuration file.

Comment out or remove the CheckAuthentication() function from ckfinder/config.php, and add it to vendors/ckfinder/config.inc.php.

function CheckAuthentication() {
    $Session = new CakeSession();
    $Session->start();
  
    return $Session->check('Auth.User');
}


There are multiple ways to check for a valid authenticated session, but this is the easiest. Of course, it makes the assumption that your authentication model is the User model.

You may wish to use the Auth component directly and use $Auth->user() to retrieve the user details. I'll probably use that method when I take a look at integrating ACLs.

Series Index : Part 1, Part 2, Part 3, Part 4, Part 5, Part 6

Friday, July 2, 2010

Integrating CKEditor and CakePHP : Part 4

In Part 3 of the series, I covered how to use the session to communicate from CakePHP to CKFinder the directories in which to browse and upload files. However, that assumed the use of PHP sessions in CakePHP. This article will cover how to use CakePHP database sessions.

The goal of this is to write as little code as possible and make the most of what has already been done. The term Don't Repeat Yourself (DRY) should be familiar to most (CakePHP) programmers.

So I took a close look at what I could do to make use of the CakePHP sessions. I don't need to start a new one, I just need to access one that might already be there.

Since we already have information getting written to our session via the ckeditor element, I don't need to make any changes there. Although I could probably change the values of path_to_destsvr_image and path_to_destsvr_file to use the DS constant, and it probably wouldn't be a bad thing to use the Session component or the CakeSession library (however, as yet, I'm still using $_SESSION).

All the action is going to happen in our app/vendors/ckfinder/config.inc.php file. But first, let's convert the app to use database sessions!

The first thing to do is add the cake_sessions table to the database. My CakePHP command lines are a little whacky, but here's what I did.

$ cd app
$ php ..\cake\console\cake.php schema create sessions


That's not quite what they want you to do in the CakePHP 1.3 manual, but it's what I did, and it installed the new table just dandy.

The next thing is to update the CakePHP core configuration (app/config/core.php) to use the database.

Configure::write('Session.save', 'database');
Configure::write('Session.database', 'default');


Hey presto! CakePHP database sessions.

What we want to do is bootstrap into CakePHP from config.inc.php without firing off the CakePHP Dispatcher. So I've made a small change to app/webroot/index.php that will prevent the dispatcher from running in the presence of a defined constant, EXTERNAL_APP. Almost at the end of the file, make this change:

if (isset($_GET['url']) && $_GET['url'] === 'favicon.ico') {
    return;
} else if (!defined('EXTERNAL_APP')) {
    $Dispatcher = new Dispatcher();
    $Dispatcher->dispatch();
}


Now we can update vendors/ckfinder/config.inc.php to access the CakeSession object.

<?php 
/**
 * This file is included by the CKFinder config.php, and will
 * set up the basePath and permissions, as is specific for the project
 */
 
define('EXTERNAL_APP', true);
// starts from app/webroot/ckfinder/core/connector/php/connector.php
include_once '../../../../../index.php'; // targeting app/webroot/index.php
 
if (!class_exists('cakesession')) {
 require LIBS . 'cake_session.php';
}
 
$Session = new CakeSession();
$Session->start();
 
// What resource type are we playing with, Image or File
if ($_GET['type'] == 'Images') {
 $baseUrl = $Session->read('path_to_dest_image');
 $baseDir = $Session->read('path_to_destsvr_image'); 
} else  /* if ($_GET['type'] == 'File') */ {
 // File is the default
 $baseUrl = $Session->read('path_to_dest_file');
 $baseDir = $Session->read('path_to_destsvr_file');
}


And that was way too easy. You can now use whatever type of session handling you want, whether it be PHP or database, and the information can be shared between CKFinder and CakePHP. I may even use this method to integrate CakePHP with legacy code, as I have previously done with CodeIgniter.

The hard part is going to be integrating authentication, and using CakePHP ACLs to define CKFinder ACLs. I'll do a posting on that just as soon as I work out how to do it.

-- edit: Added the bit about converting to actually use CakePHP database sessions.

Series Index : Part 1, Part 2, Part 3, Part 4, Part 5, Part 6