Wednesday, July 28, 2010

Starcraft II : First Impressions

Every man and his dog has probably done a StarCraft II : Wings of Liberty review when they got their hands on a beta key. But given that I didn't have my beta indicators on at the appropriate time, and am particularly craptacular at Real Time Strategy games, I'm posting my first impressions review on the second day of release.

So far, I've only taken part in the campaign. I'm a big scaredy cat, and don't like pitting myself against real people unless I know the tools at hand. And given that I haven't played SCI since at least 1997 (and I might have even stopped playing in 1996), I'll stick to the campaign until I've been through a fair whack of the single player mode.

So, after the initial installation last night, and adding the game to my account, I quickly went through the tutorials. They're great. They give a nice quick rundown on how to move, attack, build, manage supply and the like. They're also a good introduction to some of the keyboard layout.

Keyboard layout. So far, it's okay, but I think I may end up remapping keys. I'm very used to playing keyboard and mouse, and usually run with an ASDW setup for movement and strafing, with the occasional SDFE, depending on the number of abilities a particular character may have. SCII has a lot of these keys mapped to unit abilities on the character card (I think they called it that, or was it game card. The bottom right panel, anyway), and you either use the mouse, arrows or the minimap to move your viewport. Swapping my left hand between the arrows and the ability keys (which are mostly on the left side of the keyboard, especially A for Attack) can be a little frustrating. Maybe A will get a special mapping on my mouse.

After completing the tutorials, it was in with the campaign. I played the first one three times, to cover off normal, achievements and hard mode achievements. As a testament to my crap playing, only Raynor and a bunch of the civvies survived the Hard mode, so I didn't bother with Brutal mode.

The between-game interface is really well done. I had a bit of fun playing the arcade mini-game in the cantina and noticed the dance of the hologram was based on the Night Elf dance model.

Iffy things were the achievements server and RealID.

Part way through the infestation quest, I got a notification to say that achievements would not be available until further notice. At this point, I'm not playing the game for achievements. That's something I'll go back and do later, and in a few weeks or months, the achievements server should be more stable. However, if I were that type of player and it kept going offline, it would be quite annoying.

Also, since I'm located in Australia and have a SEA version of the game, I had to add my RealID friends again, and I don't get the cross-game communications with friends playing WoW. Blizzard have said that SEA players will have the option of installing the North American version of the game so we can play against friends in the US, or just have cross communications with people playing WoW (on US servers). I think Blizzard have missed the mark with RealID for people in the Oceanic region. It becomes apparent that their authentication servers and chat servers are tied together, even though there might be some replication between authentication servers in North America and SEA. I guess if I didn't play WoW at all, and didn't know anyone playing WoW, it wouldn't matter. It just seems like a bit of shiny has been tarnished. It's a small thing and has no lasting impact on the actual game. If anything, it's commentary on Blizzard's effort to break into the social networking scene. And for the record, I'll not be linking SCII with Facebook, ever.

Anyway, I want to end this first impression on a happy note, since I'm very happy with the game. Having upgrade choices, mercenary hires and research trees that don't have to be selected in the heat of battle is a big plus. Having interactive features between games to progress the storyline is also good.

Somehow, this review seems incomplete, but I've got other stuff to do (like work). Maybe I'll post some more thoughts when I've taken part in the multiplayer side of things.

Tuesday, July 27, 2010

Starcraft II in my hot little hands.

In Australia, Starcraft II : Wings of Liberty is rated M.

That's M for mature. And that's me, right?


Monday, July 26, 2010

Regretting Codeigniter

Once upon a time there was a legacy PHP application. It was written with no particular framework in mind, and no particular structure to where code was situated. For a while, it even used ADO DB, ODBC and MySQL to access the same database. Yeck.

Then one day, a developer came along. He tried to make improvements to the application, to make it easier to maintain and to make it easier to add new functionality to the product. He normalized database access to just use MySQL, and tried to at least arrange business functionality into libraries, with similar functions grouped together.

This was fine for a while, but things were still rather haphazard, and the developer knew there was plenty of room for improvement.

He decided to impose an MVC framework on the legacy application, to help with new developments. But which MVC framework to use, he was not sure. He had to make sure he could do session and authentication from both the legacy code and the new MVC code. He also had to make sure that the framework models would be flexible enough to handle the legacy database, without having to visit every program in the system.

The developer was new to MVC in PHP. He'd worked with MVC in Java, and had a pretty good idea how it was supposed to work, but applying the same principles to PHP was going to be interesting.

He tried Zend for a start, and managed to get authentication and shared sessions working, but Zend was just messy to read. The elongated class names hurt his eyes, and threatened to turn one line of code into several, just because of where the class lived in the code structure.

While he tried to decide, he worked on other PHP projects. These projects were fresh-ish. They had the opportunity to start again, but were often based on previous projects to try and get some code reused.

One of the projects used SMARTY Templates. SMARTY was but one part of the MVC equation, but the way the developer ended up using it was very nasty. SMARTY also introduced its own mini language, which was supposed to introduce a degree of independence from PHP. But nonetheless, it was another language, even if it wasn't PHP, and would still present a problem for non-programmers, so why bother. Never again would the developer touch a template system not already included in an MVC framework, and never again would he touch one that tried to introduce a separate presentation language when the native one was perfectly serviceable.

For another project, he tried CakePHP. CakePHP was very young at the time, but seemed to have some reasonable documentation, a community that was excited about what it offered and a development team that was excited about delivering that offering. CakePHP was yummy and great for fresh projects. Favouring convention over configuration, with tools to automagically generate code for models, views and controllers, putting together the basics for even large projects was fairly easy. CakePHP even allowed some configuration of models to allow for legacy databases. The only downside was its session and authentication management. Try as he might, the developer could not find a way to simply include hooks for his legacy application into the CakePHP framework to share session and authentication data.

Finally, the developer found Codeigniter (CI). CI was also MVC based, and also seemed to have a community and development team that were as excited about their framework as the CakePHP community was about theirs. Plus, CI's models were super flexible, and used Active Record for database access. This would allow the developer to handle all the weird intricacies of the legacy database. Confident in his choice, the developer integrated CI with his legacy application in a way that allowed access to the session and authentication functions of the legacy system, and allowed new parts of the system to be developed with MVC in mind.

The developer passed the new version of the product to his team members, and off they went. The legacy application would live a few more years and have new life breathed into it.

The developer went back to his CakePHP projects, and loved them so. He spent a lot of time, a lot of good times, writing new functionality, quickly, easily, cleanly.

Then one day he had reason to revisit the legacy application and CI, and it was not as he would have liked. He discovered that Helper Libraries weren't actually classes. They were just collections of regular functions and could not be overridden the way a normal class could. He discovered that the flexibility of the model was like being offered flour, water and yeast and being told it was bread. There was still a lot of repetitious work to be done, and it made him yearn for CakePHP.

Months later, the developer sits alone. His team has gone and he must live with his choice. He once went looking for an ORM tool to make modeling, and especially modeling relationships, easier. He thought he found a saviour in DataMapper OverZealous Edition, but it was not flexible enough to handle the legacy database. He has not looked at Doctrine yet, but he's not sure he wants to. Anything less than CakePHP is just flour, water, yeast and sugar that tastes like the salt in his tears.

The developer has considered Lithium. Not in the medicinal sense, silly, but another MVC framework, born of CakePHP but with a different philosophy of being lightweight and taking advantage of advances in the PHP language. Unfortunately, because Lithium will only serve PHP 5.3, the developer cannot use it. He must support 5.2 for the sake of his legacy application.

The developer has even found a way to integrate newer versions of CakePHP with legacy code, so the session and authentication information can be shared. He's just not sure if he can safely integrate his legacy PHP application, CI and CakePHP all in one product. But the more he thinks about it, the more he thinks he must try.

The developers sadness flashes red with anger. He regrets choosing CodeIgniter. He'll ignite the code, all right. Ignite the code and make cake from its ashes.

Tuesday, July 20, 2010

Hello, Mister Seven. Part 2

Well, Windows 7 64-bit is finally installed, but not without some frustration.  I should point out from the get-go that this is a Windows 7 Home Premium 64bit upgrade from Windows Vista 32bit Ultimate OEM.

The story unfolds in point form:
  • Get home from tae kwon do.
  • Install 8GB of new RAM.
  • Install new 1TB SATA HDD, making sure it's level in the case.
  • Scramble in my box of old computer shit for a SATA data cable.
  • Unplug the old SATA disk, and plug the new one as primary.
  • Start the machine.
  • Put the Win7 64bit disk in the DVD tray.
  • Kick off the Windows 7 install, via a Custom Install (so far, so good).
  • Enter a username, enter a password, enter a Product Key
  • Invalid Product Key.
WTF?  Why the face?  Why the FACE!?  Because WHAT THE FUCK!?  I got very shirty very quickly.  I tried my local IT expert (my girlfriend).  I tried her local IT expert (ex-co-worker). I tried the internet.  There's a lot of shitty information out there about upgrading from Vista 32bit to Win7 64bit.

Here's what I ended up doing, but you can probably cut straight to the bit that actually works.  I plugged the original HDD in as the primary, and the new one as the secondary.  I downloaded and ran the MS Upgrade Advisor, which told me that my install was good to upgrade to 64bit, though it would require a custom install, as I had already done.  I then googled for "win7 product key invalid", and found a tidbit that mentioned the upgrade product key would be recognised as invalid if it could not find an older version.  It also said if you needed to reinstall Win7 at anytime, you would need to reinstall the original product you upgraded from, and then do the Win7 upgrade.  God help you, if you're not upgrading to a new HDD.

Anyway, I shutdown and swapped the SATA connections so my new HDD was the primary and my old one was the secondary.  I figured that perhaps the Windows Anytime Upgrade software on the install disks might scan the other HDD and recognise the old Vista installation.  Approx 20-30min later, I was back to putting my product key in, and away it went.  Yay!

The one thing I was pleasantly surprised with was that only a couple of Windows updates were required (well, 5 downloads to be precise), and it looks like it also installed the NVIDIA control panel for me. I downloaded the latest version, just to be sure.

Tomorrow night, I will tackle installing things like WoW, the Curse Client, Steam and a bunch of other convenience stuff.

Monday, July 19, 2010

Bye-bye, Mister Vista. Hello, Mister Seven.

  • 8GB of DDR2 memory. Check.
  • 1TB HDD. Check.
  • Win7 Home Premium upgrade. Check.

Tonight, the old HDD gets unplugged, the memory gets replaced and my old Vista machine starts along the road to becoming a Windows 7 machine, 64 bit and 8GB of memory to have its way with.

The reinstall is going to take a while, I know.  And eventually downloading a few of my old games (looking at you, Steam) is going to take a while, too.

On the other hand, I'm less likely to have out of memory errors in WoW, as I play that, and do a few other things on the side.  And it's a way to extend the lifetime of the rest of the machine, without buying a whole new one.

I've also got a Sharkoon QuickPort combo eSATA on order as well.  This will come in handy for a few IDE drives that I may want to quickly check, without having to get a dedicated external case for.

Wednesday, July 14, 2010

A Git for my sanity

A few months ago, I converted all my SVN projects at work to Git. This was mostly due to the messiness that is SVN properties when doing merges. I was hoping Git was going to cut down on that, and it has somewhat, but I feel I'm missing something.

Here's a short description of one of our products. The state of play when we moved from SVN to Git was a bit shaky, as far as branches are concerned, so only after the next release will Git be working like I want it to, and how it should.

For now, I've got three "branches" to deal with.

The first branch is my "prod" branch. It's actually called "prod-20100612_email_paed_recalls", and it's a bugfix branch for what is currently live on the one site that I have to maintain it on (lucky me, it's a custom application, so I control releases to site, etc). In a perfect Git world, this should actually be the "master", but for now it's not.

The next branch is the "master" branch. At the moment, it contains the next branch of work that will be delivered in the next release. I deliver that to an external environment called Staging, where the clients get to test before it all goes live. When it does go live, then I can do away with the prod branch, and use the master branch how it was meant to be used. One thing I'm looking forward to in this branch is code delivery via Capistrano.

And my final branch is the "dev" branch. It's actually called "dev-20100712-mo" and contains the next batch of features currently in development for after the current "master" branch goes live. Fortunately, this branch was cut from the master, so at least there's properly maintained history.

Now, here's the workflow for when I have to update any of these branches. If you've got any ideas as to how to do them better, I'm welcoming all suggestions. I'll probably even post something to Stack Overflow, asking for help.

Oh yes, I should mention. Because I have to make regular updates in each of these branches, I've decided to have a clone of each one. I also have a remote called origin, where they all get pushed to, and backed up.

Workflow One: The Bugfix for Production.

So, I end up making a change to the "prod" branch because of some production bug (suffering a lot of "this was here, now it's not" issues because forward migration in SVN was a PITA).

Once I've committed the change in prod, I go to my master directory, and do the following:

$ git checkout prod
$ git pull
$ git checkout master
$ git cherry-pick XXXXXXXX
$ git push

Because prod and master don't have the nice branch relationship that master and dev do, I have to cherry-pick my changes across. I'm okay with this. I haven't tried it yet, but I'm hoping to streamline this down to (from master):

$ git pull
$ git cherry-pick XXXXXXXX
$ git push
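The streamlined version should work, with one nuance: git pull fetches all of origin's branch refs, so the prod bugfix is already reachable in the master clone, and origin/prod can be used to name its tip commit instead of the raw hash. A sketch with throwaway repos (all names here are illustrative, not the real project):

```shell
# Demo: cherry-pick the tip of origin/prod into master without ever
# checking out the prod branch. Played out in temporary repos.
set -e
tmp=$(mktemp -d); cd "$tmp"
git -c init.defaultBranch=master init -q --bare origin.git

# stand-in for the prod working directory
git clone -q origin.git prod-clone
(
  cd prod-clone
  git symbolic-ref HEAD refs/heads/master  # guard against differing git defaults
  git config user.email dev@example.com && git config user.name Dev
  echo base > app.txt && git add app.txt
  git commit -q -m "base" && git push -q origin master
  git checkout -q -b prod
  echo fix >> app.txt
  git commit -q -am "prod bugfix" && git push -q origin prod
)

# stand-in for the master working directory
git clone -q origin.git master-clone
cd master-clone
git config user.email dev@example.com && git config user.name Dev
# the streamlined workflow, never leaving master:
git pull -q                        # also fetches origin/prod
git cherry-pick origin/prod        # applies the tip commit of the prod branch
git push -q
picked=$(git log --format=%s -1)
echo "$picked"
```

The only caveat is that origin/prod names exactly one commit (the branch tip), so a raw hash is still needed when the fix isn't the newest commit on prod.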

Next step is making sure this production fix makes it into the latest development release, so over I go to my dev directory. And I do:

$ git checkout master
$ git pull
$ git checkout dev
$ git merge master
$ git mergetool && git commit # usually
$ git push

The problem with this is I always end up with normal merge conflicts, for which I have to run the mergetool. SVN was able to deal with these with no prompting, so this is a step for which I feel I'm missing something.

Possibly a better way to deal with it is (from the dev branch):

$ git pull
$ git merge origin/master
$ git mergetool && git commit # if needed at all
$ git push

The only downside is the master in the dev directory gets out of date. Theoretically, it shouldn't be a problem, it should just be a matter of checking out the master branch and doing a pull. It hasn't really worked out that well, so far.
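A hedged sketch of one way around the stale master: git fetch can fast-forward a local branch that isn't checked out if you give it an explicit refspec, so the dev clone's master can be refreshed without ever leaving dev. Demonstrated with throwaway repos (names invented for the demo):

```shell
# Demo: "git fetch origin master:master" fast-forwards the local master ref
# while dev stays checked out, so the master in the dev directory stays fresh.
set -e
tmp=$(mktemp -d); cd "$tmp"
git -c init.defaultBranch=master init -q --bare origin.git

# stand-in for the dev working directory
git clone -q origin.git dev-clone
cd dev-clone
git symbolic-ref HEAD refs/heads/master   # guard against differing git defaults
git config user.email dev@example.com && git config user.name Dev
echo base > app.txt && git add app.txt
git commit -q -m "base" && git push -q origin master
git checkout -q -b dev
echo feature > feature.txt && git add feature.txt
git commit -q -m "dev work" && git push -q origin dev

# meanwhile, a bugfix lands on master via another clone
cd ..
git clone -q origin.git master-clone
(
  cd master-clone
  git config user.email dev@example.com && git config user.name Dev
  echo fix >> app.txt && git commit -q -am "bugfix on master"
  git push -q origin master
)

# back in the dev clone: refresh local master without leaving dev
cd dev-clone
git fetch -q origin master:master   # fast-forward-only update of local master
git merge -q --no-edit master       # then merge into dev as usual
latest=$(git log --format=%s -1 master)
echo "$latest"
```

The explicit master:master refspec refuses non-fast-forward updates, which is exactly the safety you want when master only ever advances via pushes from the master directory.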

Workflow Two: The Bugfix for Staging.

This is pretty much the same as the first workflow, except I only have to deal with the master and dev directories, and there's no cherry picking involved.

Workflow Three: Development.

Well, this is essentially the last step of the last two workflows. I do some development, commit the work, push it to origin so it gets backed up (and one day, another developer may get to play with it), and then I want to pick up any changes that have happened in the master so they don't get lost ("they" being bugfixes to either production or staging).

I just hate seeing every second line in my dev branch gitk view reading "Merge branch 'master' into dev-20100712-mo" (or "Merge remote branch 'origin/master' into dev-20100712-mo", for the one occasion I've tried so far).

I'd rather update sooner and often, and I feel that's the right thing to be doing, but it just "looks" messy, and I'm not sure that leaving the merging of master back into dev to pick up bug fixes on the near completion of the dev feature branch is the right thing to do.

Any helpful hints from a seasoned Git user?

(Now just got to whittle that down to something consumable for Stack Overflow).

Friday, July 9, 2010

Fun and Games with onsubmit()

I've been programming for the browser for quite a few years now (maybe as far back as 1998, using Progress WebSpeed 2.X), and still, I'm learning things all the time.

Today's lesson was the form event, onsubmit().

With some actions in a web page, you want to make sure that some things are really meant to happen, for example, if you are deleting an entity from a back end database. The first defense (after authentication and access control lists) against accidental actions is making sure the delete gets issued as a POST, rather than a GET. And if you're paranoid enough, the next step is getting a confirmation from the user, since some users are still prone to jittery fingers and random clickiness.

So, I have this page where timesheets can be submitted for approval, deleted, and a bunch of other actions. Actions like approval, rejection, submission and deletion all have a confirmation, using the javascript confirm() function.

Side note: Is that the right spelling? Delete is to deletion, as submit is to submi.....?

And all the forms asking these confirmations had them set up on the onsubmit event handler for the forms in question.

Now, my menu structure is a bit special, so for the most part, there is not an actual form to fill out. When the user selects the Submit action from the menu, I programmatically trigger the submit event on the form, via submit(), and would expect the onsubmit handler to be called, run the confirmation and either submit or not submit based on the user's selection of Yes or No, Okay or Cancel, or whatever confirm() does in your flavour of browser.

It turns out, onsubmit() is not supposed to be triggered programmatically. It's only supposed to be triggered when there is a submit button in the form and a human selects that submit button. In my experience with Firefox 3.6.6 and Chrome 5.0.375.99, the onsubmit() function gets run, but the return value of the event handler is totally ignored.

This pretty much makes onsubmit quite useless as a catchall for validating a form, regardless of how the form is submitted.

In the end, I created a regular Javascript function for each confirmation type (e.g. confirmSubmit(), confirmApprove(), etc.), and placed onclick events on each menu link to call that function (e.g. onclick="confirmSubmit(); return false;").

In the function, I can call confirm() and then call form.submit() based on the results.
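As a hedged sketch of that pattern (the names are invented, and the confirm dependency is passed in as a parameter so the logic can be exercised outside a browser; in a real page you'd pass window.confirm):

```javascript
// Illustrative confirm-then-submit helper for menu links.
// confirmFn stands in for window.confirm so the logic is testable headlessly.
function confirmAndSubmit(form, message, confirmFn) {
  if (confirmFn(message)) {
    form.submit(); // programmatic submit, so onsubmit is bypassed anyway
    return true;
  }
  return false;
}

// In the page, a menu link might use it like:
// <a href="#" onclick="confirmAndSubmit(document.forms['deleteForm'],
//     'Really delete this timesheet?', window.confirm); return false;">Delete</a>
```

The return false on the onclick keeps the link itself from navigating, whether or not the user confirmed.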

I've compromised my programming principles for this one, since there doesn't seem to be any other option that is as simple to maintain. The only saving grace is that these forms do not have a visible aspect on the screen. The other forms on the page that actually take user input make extensive use of jQuery, dialog and validate.

Tuesday, July 6, 2010

Using Prolog for Cataclysmic Purposes

I wasn't too sure which blog to put this under, but I think I'll make this the main entry, and reference from my other one.

There's a World of Warcraft expansion coming up later this year, and my host of WoW toons will be making the journey from level 80 to 85, plus one extra Worgen Druid experiencing Azeroth all over again, from level 1. I've got a couple of other high level characters on other realms, but unless Blizzard do something about the 10 character limit per realm, I don't imagine they'll be doing much.

Anyway, one of the items facing altoholics in the expansion is which character to level to 85 first, then which second, etc. There are many factors to consider, such as profession synergies, money making abilities, survivability and the fun factor. I think the fun factor will win out for me, but it's very handy to have a host of toons with which you can craft weapons, armor, gems and elixirs.

Anyway, my inner nerd remembered a tool that I learned back in high school that would be just perfect for a job such as this: Prolog. So I toddled off to download Visual Prolog, but since this is just an idle experiment in decision making, I was happy enough to use PIE (Prolog Inference Engine), available within the free examples. PIE most closely resembles what I worked with in high school, without having to get bogged down by all the Windowsy stuff that seems to take more focus in the tutorials than it's worth (or maybe it is worth it, but I just wanted to dive into the stuff I remembered, instead of putting a window and menu together).

Anyway, a few short Prolog statements later, and I've come up with some basic facts to help determine what I might want to level first.

This set of facts should be common to all WoW characters, but I'm not sure if it's exhaustive. It certainly doesn't take advantage of tailors being able to gather extra cloth, or enchanters being able to provide enchants.





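For illustration, facts of this shape would fit the rules further down (the classes, professions and materials here are invented stand-ins, not the original listing):

```prolog
% Invented example facts -- shapes inferred from the rules used later.
armorClass(warrior, plate).
armorClass(hunter, mail).
armorClass(mage, cloth).

gathers(mining, ore).
gathers(herbalism, herbs).
gathers(skinning, leather).

requires(blacksmithing, ore).
requires(jewelcrafting, ore).
requires(alchemy, herbs).
requires(leatherworking, leather).

crafts(blacksmithing, plate).
crafts(leatherworking, mail).
crafts(tailoring, cloth).
```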
The next facts give the state of play for my toons on Aman'Thul.



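For illustration, the per-toon facts would take a shape like this (the names and profession pairings are made up, not the actual roster):

```prolog
% Invented example toons -- not the real Aman'Thul roster.
toon(ragna, warrior).
toon(fletch, hunter).
toon(zapp, mage).

hasProfession(ragna, mining).
hasProfession(ragna, blacksmithing).
hasProfession(fletch, skinning).
hasProfession(fletch, leatherworking).
hasProfession(zapp, herbalism).
hasProfession(zapp, alchemy).
```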
And these handy statements let me discover stuff that I could have worked out on paper, but would rather marvel at the revival of my high school level Prolog skills.

wears(Character, ArmorClass) :-
    toon(Character, Class),
    armorClass(Class, ArmorClass).

gathersFor(Gatherer, Crafter) :-
    hasProfession(Gatherer, GatherProfession),
    gathers(GatherProfession, Material),
    requires(CraftProfession, Material),
    hasProfession(Crafter, CraftProfession).

craftsFor(Crafter, Character) :-
    hasProfession(Crafter, CraftProfession),
    crafts(CraftProfession, ArmorClass),
    wears(Character, ArmorClass).

With these basic statements, I can see who my most useful gatherers are, for leveling professions (gathersFor(X,Y).), and who can support each other for crafting gear (craftsFor(X,Y).).

Sunday, July 4, 2010

Integrating CKEditor and CakePHP : Part 6

In Part 5 of the series, I covered how to check the CakePHP session from CKFinder to see if there was an authenticated user. There were a couple of shortcuts taken, so in this article we look at using the Auth component to check for authentication. As yet, this still does not cover ACLs in CKFinder.

Now, the previous method had CKFinder take advantage of inside information: that the authenticated user was stored in the session, and that the authentication model was the User model. We were also assuming that just because you had a session, it also meant you were allowed to upload files. While this might be true for some controllers, it might not be true for all.

While I was trying to work out how CakePHP authentication could be integrated into CKFinder, I got the chance to take another closer look at how CakePHP authentication works.

In my example code, there's very little setup for authentication, which means that as long as I have an authenticated session, I am considered authenticated for all controllers. However, this might change if I use the controller method of authentication, where each controller overrides the isAuthorized() function to determine if a user is allowed access to any actions on that controller. Or I might use the actions method of authentication, where ACLs are used to determine if a user is allowed to access a particular action (this has been my preferred method of authentication and access control, to date). There are other methods such as CRUD (ACL on actions mapped to create, read, update and delete classes), object (isAuthorized() function on any object) and model (like object, but just for models).

Suggested reading for a brief overview is the CakePHP Book section on Authentication, and for a more detailed view, go straight to the AuthComponent, but look at the source, as the summary of methods might be a bit too vague.

Anyway, to integrate with CakePHP Authentication, we have to make use of the Auth component. Components usually expect a controller, and in this case, a controller and an action are needed. For our simple authentication, we're going to say "if you're authenticated to access the contents/edit page, then you're allowed to use CKFinder". To do this, we need to communicate to CKFinder the controller and action we'll be working with, which is done via the ckeditor element. Update it with these new session variables after setting up the other session variables.

$_SESSION['controller_name'] = $this->name;
$_SESSION['controller_action'] = $this->action;

And then we update the vendors/ckfinder/ to pick out the controller and action, start up the controller and use the Auth component to determine if the user in the session is authenticated to access the controller and the action.

<?php
/**
 * This file is included by the CKFinder config.php, and will
 * set up the basePath and permissions, as is specific for the project
 */
define('EXTERNAL_APP', true);
// starts from app/webroot/ckfinder/core/connector/php/connector.php
include_once '../../../../../index.php'; // targeting app/webroot/index.php
App::import('Core', 'CakeSession');
$Session = new CakeSession();
// What resource type are we playing with, Image or File
if (isset($_GET['type']) && $_GET['type'] == 'Images') {
    $baseUrl = $Session->read('path_to_dest_image');
    $baseDir = $Session->read('path_to_destsvr_image');
} else /* if ($_GET['type'] == 'File') */ {
    // File is the default
    $baseUrl = $Session->read('path_to_dest_file');
    $baseDir = $Session->read('path_to_destsvr_file');
}

function CheckAuthentication() {
    $Session = new CakeSession();
    $controllerName = $Session->read('controller_name');
    $controllerClass = Inflector::camelize($controllerName) . 'Controller';
    $controllerAction = $Session->read('controller_action');
    $params = array('action' => $controllerAction);
    $controller = new $controllerClass();
    $controller->params = $params;
    $controller->action = $params['action'];
    return !is_null($controller->Auth->user());
}

You might notice a couple of improvements to the code. Because we're properly bootstrapping CakePHP, we now have access to the App::import() function to include CakePHP classes. There's also a chance that CKFinder will not pass the "type" on the URL, so we make sure it is set before doing anything.

Now, at this point, I wonder if there's any need to try and make use of CKFinder ACLs. Now that we have CakePHP Authentication integrated, we also have CakePHP ACLs integrated by proxy. There is one case where you might want to make use of CKFinder ACLs, and that is where you want to differentiate CRUD actions on files from CRUD actions on folders. This might be harder to set up and feed from CakePHP in a non-gimmicky way, since CakePHP ACLs deal with one access point for a single check, whereas CKFinder ACLs deal with two, files and folders.

At this point, I've just got vague suggestions for that integration. If you have set up ACLs in CakePHP, and are using groups or roles, then set the role of the user in the ckfinder element, then extract this in the vendors/ckfinder/ and populate the $config['RoleSessionVar']. Though I'm not too sure how accurate it would be to call it a role, since a user may have multiple roles, and from a look at the ckfinder/config.php, you only get to choose one.

For the moment, that concludes my series on integrating CakePHP and CKFinder via CKEditor. I hope you found it educational. It was certainly an interesting journey for me, especially being able to bootstrap CakePHP into legacy code.

Series Index : Part 1, Part 2, Part 3, Part 4, Part 5, Part 6

Saturday, July 3, 2010

Integrating CKEditor and CakePHP : Part 5

In Part 4 of the series, I covered how to integrate CakePHP database sessions into CKFinder. In this article, I will cover how to integrate CakePHP Authentication.

This is only going to cover basic authentication for the moment. No tricky ACLs, and no roles.

First, we'll need to set up our application to have someone log in. I created a users table with id, username and password. Let the baking ensue!

Because I'm aiming for the simplest set up, and least amount of code, I followed the Authentication example from the CakePHP Authentication documentation. Namely, use the Auth and Session components in the AppController, provide login and logout actions on the Users controller, and a login view. Once you've set this up, you may wish to temporarily set the allowedActions on the Users controller to '*', so you can set up at least one user entry. Then return the allowedActions back to array('login','logout').

Now we can tie in CakePHP Authentication with our CKFinder configuration file.

Comment out or remove the CheckAuthentication() function from ckfinder/config.php, and add it to vendors/ckfinder/

function CheckAuthentication() {
    $Session = new CakeSession();
    return $Session->check('Auth.User');
}

There are multiple ways to check for a valid authenticated session, but this is the easiest. Of course, it makes an assumption that your authentication model is the User table.

You may wish to use the Auth component directly, and use $Auth->user() to retrieve the user details. I'll probably use that method when I take a look at integrating ACLs.

Series Index : Part 1, Part 2, Part 3, Part 4, Part 5, Part 6

Friday, July 2, 2010

Integrating CKEditor and CakePHP : Part 4

In Part 3 of the series, I covered how to use the session to communicate from CakePHP to CKFinder the directories in which to browse and upload files. However, that was assuming the usage of PHP sessions in CakePHP. This article will cover how to use CakePHP database sessions.

The goal of this is to write as little code as possible, and make the most out of what has already been done. The term Don't Repeat Yourself (DRY) should be familiar to most (CakePHP) programmers.

So I took a close look at what I could do to make use of the CakePHP sessions. I don't need to start a new one, I just need to access one that might already be there.

Since we already have information getting written to our session via the ckeditor element, I don't need to make any changes there. Although I could probably change the values of path_to_destsvr_image and path_to_destsvr_file to use the DS definition, and it probably wouldn't be a bad thing to use the Session component or the CakeSession library (for now, though, I'm still using $_SESSION).
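For the record, the DS tweak would just mean building the server paths with CakePHP's directory-separator constant, along these lines:

```php
// DS is CakePHP's shorthand for PHP's DIRECTORY_SEPARATOR
$_SESSION['path_to_destsvr_image'] =
    WWW_ROOT . 'files' . DS . 'img' . DS . $this->name . DS . $this->params['pass'][0] . DS;
```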

All the action is going to happen in our app/vendors/ckfinder/ file. But first, let's convert the app to use database sessions!

First thing to do is add the cake_sessions table to the database. My CakePHP command lines are a little whacky (I'm on Windows, hence the backslashes), but here's what I did:

$ cd app
$ php ..\cake\console\cake.php schema create sessions

That's not quite what they want you to do in the CakePHP 1.3 manual, but it's what I did, and it installed the new table just dandy.

The next thing is to update the CakePHP core config to use database sessions:

Configure::write('', 'database');
Configure::write('Session.database', 'default');

Hey presto! CakePHP database sessions.

What we want to do is bootstrap into CakePHP from, without firing off the CakePHP Dispatcher. So I've made a small change to app/webroot/index.php that will prevent the dispatcher from being run in the presence of a defined constant, EXTERNAL_APP. Almost at the end of the file, make this change:

if (isset($_GET['url']) && $_GET['url'] === 'favicon.ico') {
} else if (!defined('EXTERNAL_APP')) {
    $Dispatcher = new Dispatcher();
    $Dispatcher->dispatch();
}

Now we can update the app/vendors/ckfinder/ file to access the CakeSession object:

<?php
/**
 * This file is included by the CKFinder config.php, and will
 * set up the basePath and permissions, as is specific for the project.
 */
define('EXTERNAL_APP', true);
// starts from app/webroot/ckfinder/core/connector/php/connector.php
include_once '../../../../../index.php'; // targeting app/webroot/index.php

if (!class_exists('CakeSession')) {
    require LIBS . 'cake_session.php';
}
$Session = new CakeSession();

// What resource type are we playing with, Image or File?
if ($_GET['type'] == 'Images') {
    $baseUrl = $Session->read('path_to_dest_image');
    $baseDir = $Session->read('path_to_destsvr_image');
} else /* if ($_GET['type'] == 'Files') */ {
    // File is the default
    $baseUrl = $Session->read('path_to_dest_file');
    $baseDir = $Session->read('path_to_destsvr_file');
}
And that was way too easy. You can now use whatever type of session handling you want, whether it be PHP or database, and the information can be shared between CKFinder and CakePHP. I may even use this method to integrate CakePHP with legacy code, as I have previously done with CodeIgniter.

The hard part is going to be integrating Authentication, and using CakePHP ACLs to define CKFinder ACLs. And I'll do a posting on that, just as soon as I work out how to do it.

-- edit: Added the bit about converting the app to actually use CakePHP database sessions.

Series Index : Part 1, Part 2, Part 3, Part 4, Part 5, Part 6

Thursday, July 1, 2010

Integrating CKEditor and CakePHP : Part 3

In Part 2 of this series, I covered how to add CKFinder to your CakePHP and CKEditor installation. This article will cover a particular requirement to upload files to a specific directory based on the content identifier.

In my TinyMCE solutions, I use cookies to tell the TinyMCE uploader which directory it should upload files to, and what the relative URL is for that location. In hindsight, that's probably not so secure, because we end up sending absolute server paths in the cookie data to the client. So this time around, I'm going to use the PHP session, as generated by CakePHP. This assumes we're using basic PHP sessions, as defined by using the value "php" for the core config item ''.
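For reference, that's this setting in app/config/core.php:

```php
Configure::write('', 'php');
```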

With regards to the requirement for uploading to specific directories: I usually have a publish behaviour attached to most of my editable models, where the user can make changes to an unpublished version that aren't made apparent on the website until they're published. Going along with that idea, any images or files uploaded for use in the content are also copied to a published version of the content item. I'm not going to cover the publish behaviour here, but I will cover uploading to a specific directory, so that all images are uploaded to app/webroot/files/img/contents/X and files to app/webroot/files/files/contents/X, where X is the id of your contents item.
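To make the convention concrete, here's a standalone sketch (the function name and example webroot path are mine, purely for illustration) of the four values we'll shortly be putting in the session:

```php
<?php
// Build the public URLs and server directories for a content item's uploads.
// $controllerName mirrors $this->name; $id mirrors $this->params['pass'][0].
function contentUploadPaths($controllerName, $id, $webroot = '/var/www/app/webroot/') {
    return array(
        'path_to_dest_image'    => '/files/img/' . $controllerName . '/' . $id . '/',
        'path_to_destsvr_image' => $webroot . 'files/img/' . $controllerName . '/' . $id . '/',
        'path_to_dest_file'     => '/files/files/' . $controllerName . '/' . $id . '/',
        'path_to_destsvr_file'  => $webroot . 'files/files/' . $controllerName . '/' . $id . '/',
    );
}

$paths = contentUploadPaths('contents', 42);
echo $paths['path_to_dest_image']; // /files/img/contents/42/
```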

The first change to make is for CakePHP to hand the new $baseUrl and $baseDir settings over to CKFinder. We're going to do this via the PHP session and the ckeditor element. We also want to prevent uploading for a brand new content item, since an id will not have been created yet. Here's the revised element code:

<?php if (isset($this->params['pass'][0])) {
 // only allow upload if we have a place to put it
 $_SESSION['path_to_dest_image'] = '/files/img/'.$this->name.'/'.$this->params['pass'][0].'/';
 $_SESSION['path_to_destsvr_image'] = WWW_ROOT.'files/img/'.$this->name.'/'.$this->params['pass'][0].'/';
 @mkdir($_SESSION['path_to_destsvr_image'], 0777, true);
 $_SESSION['path_to_dest_file'] = '/files/files/'.$this->name.'/'.$this->params['pass'][0].'/';
 $_SESSION['path_to_destsvr_file'] = WWW_ROOT.'files/files/'.$this->name.'/'.$this->params['pass'][0].'/';
 @mkdir($_SESSION['path_to_destsvr_file'], 0777, true);
?>
<script type="text/javascript">
$(function() {
 $('textarea').each(function() {
  var editor = $(this).ckeditorGet();
  CKFinder.setupCKEditor(editor,
   {basePath: '/ckfinder/',
   rememberLastFolder: false});
 });
});
</script>
<?php } ?>

What I've done here is make sure we actually have an id to work with (via $this->params['pass'][0]) before we add CKFinder, then add the session variables and make sure the upload directories actually exist.

Secondly, I've created a separate file to contain my CKFinder config code. I like to keep that sort of thing out of the ckfinder directory, but also out of the main CakePHP app, so I've gone for app/vendors/ckfinder/ You'll also want to hook that into the app/webroot/ckfinder/config.php file, somewhere near the top. Now, because all CKFinder PHP code runs relative to ckfinder/core/connector/php/connector.php, you'll want your config.php entry to look as follows:

include_once "../../../../../vendors/ckfinder/";

That's a lot of parent directories!

In, I make sure I tap into the CakePHP named session, start it, then set $baseUrl and $baseDir from the session values.



// tap into the session CakePHP created; 'CAKEPHP' is the default
// Session.cookie name in app/config/core.php
session_name('CAKEPHP');
session_start();

// What resource type are we playing with, Image or File?
if ($_GET['type'] == 'Images') {
 $baseUrl = $_SESSION['path_to_dest_image'];
 $baseDir = $_SESSION['path_to_destsvr_image'];
} else /* if ($_GET['type'] == 'Files') */ {
 // File is the default
 $baseUrl = $_SESSION['path_to_dest_file'];
 $baseDir = $_SESSION['path_to_destsvr_file'];
}

I haven't catered for the CKFinder Flash type, but if you wanted to, you could.

And one more change to ckfinder/config.php. Since I have separate directories for images and files, I don't really want another subdirectory per file type. So I've changed the ResourceType entries to use just $baseUrl and $baseDir for url and directory respectively, for both Images and Files.

And I don't want thumbnails appearing in my CKFinder browser, so I've added "_thumbs" to the HideFolders config item.
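In sketch form, those two ckfinder/config.php tweaks look something like this (the allowedExtensions lists here are placeholders; keep whatever your CKFinder shipped with):

```php
// Both resource types point straight at the session-supplied locations,
// with no extra per-type subdirectory.
$config['ResourceType'][] = array(
    'name'              => 'Images',
    'url'               => $baseUrl,
    'directory'         => $baseDir,
    'maxSize'           => 0,
    'allowedExtensions' => 'gif,jpeg,jpg,png',
    'deniedExtensions'  => '',
);
$config['ResourceType'][] = array(
    'name'              => 'Files',
    'url'               => $baseUrl,
    'directory'         => $baseDir,
    'maxSize'           => 0,
    'allowedExtensions' => 'doc,pdf,zip',
    'deniedExtensions'  => '',
);

// Keep CKFinder's thumbnail cache out of the file browser.
$config['HideFolders'] = array('.svn', 'CVS', '_thumbs');
```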

And that's how I integrate CKFinder to use a specific upload directory via the CakePHP session, assuming you're using basic PHP Sessions.

However, that's not enough. I have some products that are using CakePHP database sessions. I'll cover how to deal with that in my next article.

Series Index : Part 1, Part 2, Part 3, Part 4, Part 5, Part 6

What's Old Is New Again

Having just completed a series of posts about integrating CakePHP and CKFinder, I was noodling around and decided to take a closer look at the website that provides the escaping tool I use to prepare source for this blog. It seems to be a great site for all things related to accessibility on the web. However, the primary contributor has become a little disillusioned about the website and whether he should keep it going. I think it's great: it has a good selection of tools that I'll be taking a gander at over the coming weeks, and some interesting articles on web accessibility.

Accessibility on the web still seems to be a fringe discipline. It's not until your country legislates that accessibility must be taken into account, you're working for the government (or some other entity as large and bureaucratic as the public service), or your target audience is using a screenreader, that you actually bother to look at the issue in detail. Usually, you just try to keep the HTML clean, use CSS (and not the inline variety) and avoid using Javascript to generate content.

One of the more interesting articles on the site highlighted a tool that fills a need dear to my heart: having to provide for Internet Explorer 8, Internet Explorer 7 and Internet Explorer 6 all at once (*cough* *bollocks* *cough*). That tool is the Microsoft Expression Web SuperPreview for Windows Internet Explorer. Since the initial article was written, it looks as though this tool is available for free, though you can pay for the full version of Expression Web, which seems to allow comparison with Firefox as well. Hopefully it will also do Chrome and Safari.

These days, I just don't cater for IE6 at all, if I can help it. It's bad enough that web development has to be revisited at least once for each browser engine (Gecko, Webkit and WTF IE uses), but then I have to provide specific CSS for IE exceptions, and then do it twice again for IE7 and IE6. This tool will let you compare a website between IE8, IE6 and the IE8 impression of IE7, side by side (or overlaid, or tiled horizontally). It will also allow comparisons against images, just in case your website design has been delivered as an image (Photoshop, usually, but in rare annoying cases, JPG, PNG or GIF).

Anyway, now I don't need to maintain VMs for the sole purpose of being able to run IE7 and IE6. Well, almost. There's still Javascript to test, but given my usage of jQuery, it's usually not such an issue.