In summary, Earth Live is an application that allows users to view global data on a 3D surface. It's maintained and updated daily by Discovery, which provides content about climate issues directly from scientists, NASA, and NOAA (in the "featured stories" area) and lets users build their own global "stories" from data sets compiled over the last 24 hours (in the "create a story" area). Any story, whether featured or created, can then be shared on a user's blog, MySpace, Facebook, etc. in widgetized form. Here's an example of the widget with no layers, which is how I like it best:
Earth Live was exciting to build and presented some interesting technical hurdles. We wanted an interface that was fun to use but still conveyed the data, which was difficult because until about halfway through the project the data itself wasn't well defined. I heard a speaker at MAX this summer say something that should be obvious: data comes first, interfaces second. Building an interface for data you don't have yet is always risky. In this case I think we lucked out and the interface fits the data pretty darn well, but I still wish we'd seen the data sooner.
We decided early on that the globe should be a real 3D object, for a couple of reasons. First, there was the way our sales guy pitched it to the client: his big idea was to mask a giant repeating flat map so that it APPEARED to be a real sphere - we in the development department were having none of that. If it was going to look like a globe, then by golly it was going to BE a globe. Second, our designers convinced us that 3D globes are just more fun to use. We'd used Papervision3D on a few projects before and had read about its recently improved performance, so we decided to give it a go.
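For anyone curious, the basic setup is tiny. This is a from-memory sketch of that first Papervision3D pass rather than anything we shipped - the texture, viewport size, and segment counts are all placeholders - but it's roughly what a textured, spinning globe looks like in Papervision3D 2.0:

    package
    {
        import flash.display.BitmapData;
        import flash.events.Event;

        import org.papervision3d.materials.BitmapMaterial;
        import org.papervision3d.objects.primitives.Sphere;
        import org.papervision3d.view.BasicView;

        public class GlobeView extends BasicView
        {
            private var globe:Sphere;

            public function GlobeView(earthTexture:BitmapData)
            {
                super(800, 600);                               // viewport size (placeholder)
                var skin:BitmapMaterial = new BitmapMaterial(earthTexture);
                globe = new Sphere(skin, 300, 24, 18);         // radius, segmentsW, segmentsH
                scene.addChild(globe);
                startRendering();                              // re-render on every frame
            }

            // spin the globe a little each render tick
            override protected function onRenderTick(event:Event = null):void
            {
                globe.yaw(0.2);
                super.onRenderTick(event);
            }
        }
    }

The segment counts are the usual quality/performance trade-off: more segments means a rounder-looking globe and a slower render.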
From the beginning, Discovery had been interested in using this app to support their Earth Live blog, which tracks the adventures of scientists around the globe. We'd planned on building this feature as clickable push-pins on the globe. The feature worked, but with weird rendering bugs around the edges in Papervision. Though John Grden assures me this is not a bug, we made the switch to Away3D - a Papervision spinoff project contributed to by my buddy Peter Kapelyan - and everything was fine. Away3D also gave us some performance gains, which were greatly appreciated.
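Placing a pin is mostly just spherical coordinates. Here's the gist - a sketch, not our exact code, and the axis orientation you actually want depends on how the globe texture is UV-mapped:

    // One common convention for projecting a lat/lon pair onto a sphere of a
    // given radius. Degrees in, a plain {x, y, z} object out.
    function latLonToPosition(latDeg:Number, lonDeg:Number, radius:Number):Object
    {
        var lat:Number = latDeg * Math.PI / 180;    // degrees to radians
        var lon:Number = lonDeg * Math.PI / 180;
        return {
            x: radius * Math.cos(lat) * Math.sin(lon),
            y: radius * Math.sin(lat),
            z: radius * Math.cos(lat) * Math.cos(lon)
        };
    }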
We planned the application to display two kinds of data: flat image files and KML drawings. Flat image files were easy to do in Papervision, but the KML piece was more work. Brad Umbaugh, who did most of our globe development, spent about two weeks on the KML piece, but in the end we decided it just didn't look right, so it isn't being used. The functionality is in there, hoping and waiting for an eventual upgrade in version 2.0.
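For the curious, the shelved KML work boiled down to E4X parsing: pull the "lon,lat[,altitude]" triples out of a <coordinates> element and hand each pair to the same kind of projection the push-pins use. A simplified sketch only - real KML nests coordinates under Placemark, LineString, and so on, while this just grabs the first element it finds:

    // Split a KML <coordinates> element into plain lat/lon pairs.
    function parseCoordinates(kml:XML):Array
    {
        var ns:Namespace = kml.namespace();               // works for any KML namespace version
        var pairs:Array = [];
        for each (var triple:String in String(kml..ns::coordinates[0]).split(/\s+/))
        {
            if (triple == "")
                continue;                                 // skip blanks from surrounding whitespace
            var parts:Array = triple.split(",");          // "lon,lat[,altitude]"
            pairs.push({ lat: Number(parts[1]), lon: Number(parts[0]) });
        }
        return pairs;
    }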
Somewhere around the beginning of October we started working with the real data, and Discovery came to us with an exciting proposition: how about videos? In theory it worked, but in practice the performance sucked. It sucked bad - browser-crashing, slow-as-molasses, 2 fps bad. Brad spent a few weeks tweaking it, and with the help of some other EUI staff (Jim Cheng in particular) we came up with the following system:
- All of our videos are embedded in swfs.
- When we load a video into the app, we immediately strip all of its frames out into raw image files during the first pass through the video.
- In the second pass, we flatten those images with any other layers active on the globe at the time (though Discovery isn't currently layering any images over its videos, the capability exists). A rough sketch of these two passes follows the list.
- After this, the globe updates its wrapper material once every frame, and the result looks like video. Performance is still rough on older machines, but it's decent on most and great on anything made in the last three years.
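Here's roughly what those first two passes boil down to - simplified stand-ins rather than our actual classes, but gotoAndStop plus BitmapData.draw is the heart of it:

    import flash.display.BitmapData;
    import flash.display.MovieClip;

    // First pass: step the embedded video swf's timeline and snapshot each frame.
    function ripFrames(clip:MovieClip, frameWidth:int, frameHeight:int):Array
    {
        var frames:Array = [];
        for (var i:int = 1; i <= clip.totalFrames; i++)
        {
            clip.gotoAndStop(i);                               // advance the loaded swf one frame
            var snapshot:BitmapData = new BitmapData(frameWidth, frameHeight, false, 0x000000);
            snapshot.draw(clip);                               // rasterize this frame
            frames.push(snapshot);
        }
        return frames;
    }

    // Second pass: composite whatever layers are active onto each snapshot.
    function flattenLayers(frame:BitmapData, layers:Array):BitmapData
    {
        var flattened:BitmapData = frame.clone();
        for each (var layer:BitmapData in layers)
            flattened.draw(layer);                             // overlay on top of the video frame
        return flattened;
    }

The important part is that all the expensive rasterizing happens up front, so the per-frame work at runtime is just handing an already-built BitmapData to the globe's material.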
During the last month Kevin and Brad transitioned off the project, and I worked bugs and updated the widget. I went through some hurdles learning the Facebook integration process, partially documented in a post here, and finally got it all working late last week. That's right - right before the deadline. You know how it goes. :)
All in all, this was a great project. Discovery is by far one of my favorite clients ever - they're amazingly enthusiastic and easy to work with. I also want to give major credit to Bobby Jamison and Jeremy Graston, who worked on things from the design angle, and Shannon Garret, who built all of our marketing materials including the Earth Live logo.
A final note to developers: the app currently throws a runtime exception that you might see in the debugger:
TypeError: Error #1034: Type Coercion failed: cannot convert flash.events::Event@27513d91 to mx.events.IndexChangedEvent.
This is an Adobe issue documented in their bug base here. I'd like to find the offending piece of code that dispatches the event and catch it, but it's not a high priority right now, since it's only visible to users running the debug player -- non-developers will never see it. :)
2 comments:
Hi RJ,
Very very impressive work!
I have an issue concerning Flex performance that I was hoping you could help me out with. I would be grateful for any help or advice! :D
I'm currently a graduate student in computer science and chose Flex as the development platform for my project, which goes live in a month. Part of the project requires displaying streaming MJPEG video from a socket source. I have written a custom byte buffer, which queues up data from the socket until a full JPEG is present. When a JPEG is present, its bytes are written into a ByteArray and loaded with a Loader object (through loadBytes). This Loader is the child of one of two Image controls, which alternate visibility for double buffering.
The issue I'm having is that I cannot achieve more than 17 frames per second; under a previous Java implementation, I was able to get 30. Increasing the application frame rate to 60 works (I can match the Java performance), but I was hoping for a cleaner solution. I have narrowed the issue down to the Loader.loadBytes method: the asynchronous event handling of the load completion significantly slows down the load time. Without that call, the system processes the JPEG bytes at over 30 frames per second - of course, the JPEGs aren't displayed then.
Do you know of any way to alleviate this bottleneck, like getting the ByteArray into the Image other than using loadBytes? Thanks in advance!
Hey Bryce,
I can think of a few different things to try, but I've never tried to do anything similar to this so I'm not sure if they'd work. If you'd like to send me the code, and if it's in a state I could debug through, I'd be happy to give it a shot...
rj[dot]owen[at]effectiveui[dot]com
There should be *some* way to get Flex to treat the byte array as bitmap data directly without going through loadBytes. Have you tried using the Bitmap or BitmapData classes at all? They're a bit lower-level than Image - there might be something good in there.
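Completely untested, but off the top of my head the kind of thing I mean looks like this: one Bitmap on the display list backed by a single BitmapData, with each decoded JPEG blitted into it, so the two Image controls drop out of the loop entirely. It still leans on loadBytes for the decode, so it may not win back the async cost:

    import flash.display.Bitmap;
    import flash.display.BitmapData;
    import flash.display.Loader;
    import flash.events.Event;
    import flash.utils.ByteArray;

    const FRAME_WIDTH:int = 320;     // placeholders - use your stream's frame size
    const FRAME_HEIGHT:int = 240;

    var screenData:BitmapData = new BitmapData(FRAME_WIDTH, FRAME_HEIGHT, false, 0x000000);
    var screen:Bitmap = new Bitmap(screenData);               // add this to the display list once
    var decoder:Loader = new Loader();
    decoder.contentLoaderInfo.addEventListener(Event.COMPLETE, onFrameDecoded);

    // call this with each complete JPEG pulled out of your byte buffer
    function showJpeg(jpegBytes:ByteArray):void
    {
        decoder.loadBytes(jpegBytes);                         // async decode of one MJPEG frame
    }

    function onFrameDecoded(event:Event):void
    {
        screenData.draw(decoder);                             // blit the decoded frame into the live bitmap
    }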
Like I said, if you want to send me over the code I could bounce the problem off of the team here. There are guys here with tons more experience manipulating image data than I have, and I bet we could come up with something for you. Let me know!