
Category Archives: Adobe

There’s a lot of debate floating around the internet over the iPhone and Flash. Here’s an example. A lot of it is rather frustrating since it is mired in rhetoric. While most people won’t read this, it’s probably worth writing down for posterity’s sake. Having actually played with Flash development tools (not so much the authoring tool that comes with CS4), I thought it worth commenting on.

1. Flash is unstable

I honestly cannot remember the last time it crashed for me. I suppose it might be unstable if your machine has several out-of-date versions of the runtime installed.

2. Flash eats the battery alive

Maybe on the Mac, which doesn’t have support for hardware acceleration. When I used a Mac as my main machine I never really understood this: why Apple wouldn’t make the API public for third-party apps to use hardware acceleration on the video card. Their own apps can use it (Safari and the DVD Player, but that is about it), yet third-party apps cannot. In fact, if you want to play back video on OS X you have to do so through the QuickTime libraries, which don’t support hardware acceleration. Maybe Apple does this deliberately to cripple Flash.

As a matter of fact, Hulu running in HD mode on my ancient Dell Optiplex 745 uses around 15–20% of the CPU, whereas playing video on my Mac in any program outside of Safari uses over 95% of the CPU, and not just in Flash. Sounds like Steve should blame his OS engineers for the wasted CPU time.

No, the reality is that Flash is an application platform these days, not at all unlike Java (which has also been barred from the iPhone). If Flash ran on the iPhone it would open up the device to applications delivered over the web, without App Store approval or App Store revenue. I suspect it’s as simple as that.


I noticed a good article that discusses some of the changes happening in World of Warcraft. It’s worth reading even if you aren’t into video games, because software developers across the spectrum run into the same problem to a lesser extent. The article touches on a couple of things, but three stand out to me. First, the major design decisions were handed off to another team at Blizzard as of last February. This was apparent at last year’s BlizzCon, when the dungeons-and-raids panel didn’t feature Jeff Kaplan, the game’s former lead designer and a designer on EverQuest, but a relatively new team (new to me, at least).

Second, game design and balance on a live game is really hard because of the human factor. Interestingly enough, several design decisions made a huge difference in several of the professions I had worked on in-game. One example:

Alchemy: Alchemists make potions that buff players and restore mana and health. They also do other things, like transmutes (changing one resource into another) and combat elixirs (basically buff potions that increase your abilities). Elixirs and transmutations are the most expensive to make and have limitations that balance the profession against the game and the economy (for instance, alchemists can only do one transmutation every 24 hours). In the last expansion (Burning Crusade) they introduced three specializations you could pick: potions, elixirs, and transmutations. Basically, all a specialization did was occasionally give you extras for free: make a potion and you’d sometimes get 2–4 extra ones, do a transmutation and you’d sometimes get 5 extra from one item transmuted, and so on. Well, I picked potions, because healers (who tend to use a lot of mana) were going through them like no one’s business, and it was very lucrative.

Sounds good so far, right? Well, then they changed the way healing works in the game. They no longer made it beneficial to downrank healing spells (in other words, use lower-level spells that consume less mana), which increased mana consumption. This part sounds good for a potion specialist, right? Well… they also increased player mana regeneration and gave several classes replenishment abilities that restore mana to the party your character is in. With a properly constructed raid it’s actually very hard to run out of mana without doing anything special.

It gets worse too: they introduced a hidden debuff (it wasn’t actually hidden in the beta) called potion sickness. In combat you can really only take one restorative potion now. When the character is in combat, different game mechanics take effect (you don’t regenerate mana as quickly, you cannot revive someone from death, stuff like that). In a single raid you could drink at most 11–12 potions (one for each boss), but like I said, you probably won’t need to when everything is going smoothly.

The net effect is that no one really needs stacks and stacks of potions for raids anymore; pretty much the only essential thing is stacks of elixirs and flasks. I made 40 mana potions for my shaman last winter and I still have 30+ left… All this because some game designer decided that raids were too consumable-driven (whoever has the best buffs wins). Even funnier, I still have stacks of the older-style potions as well, because I just haven’t gone through them like I used to.

Most players probably like this decision. I did for the most part, right up until I realized all the effort put into making my warlock a potion master was pretty much wasted. Now I have to pay some NPC a bunch of money to unlearn my specialization and start a new one, which is actually a pain because the quest is material-intensive and involves materials only found in older zones (two of the specializations require runs through older dungeons), which means lots of grinding. Sounds like a hassle.

My druid ran into the leatherworking problem mentioned in the article as well: I have loads of patterns in my leatherworking notebook for quivers and ammo bags, but it’s been a good long while since any hunter wanted me to make one.

Keep in mind, all of this isn’t necessarily a bad thing. In WoW they have gone from one or two guilds that could complete raid instances to easily a dozen, but you can quickly see how positive changes can affect the product in a negative way.

Someday I can talk in detail about similar positive changes to Acrobat that affected users in unforeseen ways. The MDI vs. SDI issue, for example (since it’s public knowledge). I really can’t tell you how many bugs existed in one mode but not the other (a good example: play with annotations on two different PDF files with the properties bar on…), or the extent to which having this option increased testing time (you would have to run all UI tests twice on all 30+ supported platforms, for example). The major issue is that MDI is a deprecated feature in Windows; there are platform bugs with MDI that will NEVER be fixed by Microsoft. If I were a product designer, the way I’d address this is to keep SDI but introduce document tabs similar to Firefox’s. That may not be a good solution either, as it would introduce a whole new set of problems and require a lot of engineering effort to write a whole new window handler. A quick search on Google shows there are still plenty of people upset about all this. Alas, not my issue anymore :).

With FrameMaker 8 :). Check out this screenshot I made:

Not a trick or a hack, but native Unicode (UTF-8) support.

This is Lake Union just outside the Adobe Seattle office.

I was testing a FrameMaker issue the other day – which involved growing the fntcache.dat file.

The fntcache.dat file lives in the \windows\system32 directory, and according to a Microsoft engineer I talked to, it is a bitmap cache used by GDI. The way it works is that once a font at a particular size has been used, it’s cached as a black-and-white image inside that file. This makes Windows a bit quicker at rasterizing fonts on screen.

We found that when this file gets too big (I’m guessing around 2 megabytes), Frame starts dropping text in PDF output. It’s a rather annoying problem that we have known about for a while. There’s even a public document about this issue with some things that may or may not solve it.
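Since the trigger seems to be the file’s size, a quick way to keep an eye on it is a small script. This is just a sketch: the ~2 MB threshold is my guess above, not a documented limit, and the path may differ on other Windows versions.

```python
import os

# Path where Windows keeps the GDI font bitmap cache, per the note above.
FNTCACHE = r"C:\Windows\System32\fntcache.dat"

# ~2 MB is a guessed threshold, not a documented limit.
THRESHOLD_BYTES = 2 * 1024 * 1024

def cache_too_big(path=FNTCACHE, threshold=THRESHOLD_BYTES):
    """Return True if the font cache file exists and exceeds the threshold."""
    try:
        return os.path.getsize(path) > threshold
    except OSError:
        # File missing or unreadable: nothing to worry about.
        return False

if __name__ == "__main__":
    print("fntcache.dat over ~2 MB:", cache_too_big())
```

If it reports the cache is over the threshold, deleting fntcache.dat (Windows rebuilds it) is the usual first thing to try before digging further.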

So how do you make this file bigger? Add fonts to the system!

So how does the system behave after adding all these fonts? Actually, I can’t tell much difference. Some applications take a bit longer to load (enumerating all those fonts, I suspect), but the machine doesn’t take any longer to boot and it’s not sluggish or anything. The computer is a 3 GHz P4 with 1 GB of RAM.

After doing this I was able to reproduce the problem.