*Coder Blog

Life, Technology, and Meteorology

Category: Seasonality

New Disk

Having an application like Seasonality that relies upon online services means those services have to be reliable, so any server I host needs to be online as close to 100% of the time as possible. Website and email services are easy to hand off to a shared hosting provider for around $10-20/month. It’s inexpensive, and you can leave the server management to the hosting provider. For most software companies, this is as far as you need to go.

This also worked okay when Seasonality was simply grabbing some general data from various sources. As soon as I began supporting international locations, I stepped out of the bounds of shared hosting. The international forecasts need to be hosted on a pretty heavy-duty server. It pegs a CPU for about an hour to generate the forecasts, and the server updates the forecasts twice a day. Furthermore, the dataset is pretty large, so a fast disk subsystem is needed.

So I have a colocated server, which I’ve talked about before. It worked out pretty well until earlier this week, when one of the four disks in the RAID died. Usually when a disk in a RAID dies, the system remains online and keeps working (as long as you aren’t using RAID 0). In this case, though, the server crashed, and I was puzzled as to why.

After doing some research, I found that the server most likely crashed because of an additional partition on the failed disk: a swap partition. When setting up the server, I configured swap across all four disks, hoping that if the system ever did dip into swap a little, spreading the activity would be much faster than hammering a single disk. The logic seemed sound at the time, but looking back it was a really bad move. In the future, I’ll keep swap on just a single disk (probably the same one as the / partition); with four disks, that cuts the chance that a random disk failure takes the system down with it by 75%.
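For illustration, here’s roughly what the two layouts look like in /etc/fstab. The device names and partition numbers are hypothetical; the point is that with equal priorities the kernel interleaves swap across all the disks, so a single disk failure can take live swap pages down with it:

```
# Old layout (risky): swap striped across all four RAID member disks.
#/dev/sda2   none   swap   sw,pri=1   0   0
#/dev/sdb2   none   swap   sw,pri=1   0   0
#/dev/sdc2   none   swap   sw,pri=1   0   0
#/dev/sdd2   none   swap   sw,pri=1   0   0

# New layout: swap lives on the same disk as the / partition only.
/dev/sda2   none   swap   sw   0   0
```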

After getting a new disk overnighted from Newegg, I replaced the failed mechanism and added it back into the RAID, so the system is back up and running again.

This brings up the question of how likely something like this is to happen again. The server is about two and a half years old, so disk failures at this age are reasonable, especially considering the substantial load on this server’s disks (blinky lights, all day long). At this point, I’m thinking of just replacing the other 3 disks. That way, I’ll have scheduled downtime instead of unexpected downtime. With the constantly dropping cost of storage, I’ll be able to replace the 300GB disks with 750GB models. It’s not that I actually need the extra space (the current 300s are only about half full), but I need at least 4 mechanisms to get acceptable database performance.

In the future, I will probably look toward getting hot-swappable storage. I’ve had to replace 2 disks now since I built the server, and the option of just sliding a disk out and replacing it with a new drive without taking the server offline is very appealing.

Indie Marketing @ Macworld

Macworld Expo San Francisco is one of the largest, if not the largest, Mac user event of the year. For an indie Mac developer, if there is one conference (other than WWDC) that should be attended, this is it. So why haven’t I attended in previous years? I asked myself that same question last year after hearing about all the indie get-togethers and bar nights.

The Good Ol’ Days

The last time I attended Macworld Expo was back in 2001, just after I graduated from UCSB and before starting a job in Tucson, AZ. A lot has changed in my career in these past 7 years. For one, 7 years ago I hadn’t yet developed any software for the Mac platform. Though I was an avid Mac user, at that time I was programming mostly for Unix, and occasionally on Windows (against my will).

But that was years ago… I started programming for Mac OS X in 2002, so the question remains: why haven’t I been attending Macworld? I think it may have something to do with the conditions under which I attended Macworld in the past. You see, the first year I attended Macworld Expo was back in 1990. The Mac IIfx was the big new machine at the time, and with the cost of such a machine nearing $10,000, only a few companies had that kind of hardware at their booth. Mac IIcx and IIci machines were more common, as was the Mac Portable, which was new at the time. I attended Macworld every year after that until 1997, when it didn’t make sense to take time off from classes at UCSB to do so. To me, attending the expo was a fun event, almost like going to an amusement park. Yeah… I was most definitely a Mac geek.

Perspective

The thing is, I never saw Macworld as a business event…it was strictly for fun. And now that I’m living in Michigan, it didn’t make sense to spend the money to attend a “fun” event. It wasn’t until I started talking to other developers who had attended the conference that I realized just how much I was missing by not attending.

Will I have a booth? No. How about one of those ADC developer kiosks? Nope. Why not? Well, this year I just want to re-learn the ropes of the conference. Paul Kafasis has written a nice series of articles on exhibiting at Macworld, but it’s been such a long time, I really want to get a recent perspective on what the conference is like before plunking down $10k to become an exhibitor. So this year, Gaucho Software will be at Macworld as a Quasi-Exhibitor.

What does this mean? Well, it means I’ll have many of the same materials an exhibiting company would, except for the actual 10×10-foot patch of real estate on the show floor. First, I designed a different Seasonality t-shirt for each day of the show and had Zazzle print them up. Second, I designed a flyer and ordered 1000 copies from SharpDots. Finally, I put in an order through PensXpress for 200 Seasonality pens to give away at the show. Let me elaborate a bit on my reasoning for each of these…

1. T-Shirts

I started designing and ordering the first Gaucho Software t-shirts about 18 months ago for WWDC 2006. Thanks to outfits like Zazzle and CafePress, it’s now easy to print a custom design on a t-shirt of pretty much any color and style. At the time, I just threw the words Gaucho Software across the front and a big logo across the back. It was beneficial to wear at developer conferences like WWDC and C4, because it gave people a better idea of who I was before actually meeting them. Did it increase sales? No… but that’s okay; the shirts were still cool to have.

For WWDC 2007, I designed a t-shirt highlighting Seasonality and I wore it on a day when there was an event at the SF Apple Store in the evening. Surprisingly enough, I found a nice little spike in sales during the day or two after wearing that shirt. Hey, if that one t-shirt helped sales, wearing a different Seasonality shirt each day of Macworld should help too…

2. Flyers

The decision to design a flyer to hand out at the show was easy, but working through the details of actually designing it was much more difficult. First, I had to choose a size. I went with a half-sheet, or 8.5×5.5 inches, because I didn’t want the flyer to get lost in the shuffle. I remember getting tens, even hundreds, of flyers every day I attended in previous years. A full-page flyer would require a lot of content and would be more difficult to hand out. Going with a size that is as wide as a normal page but not as tall keeps it from getting lost, while still being easy to hand out.

The design was a bit tricky. I’m used to designing interfaces on-screen in the RGB colorspace; designing for print is different. First, you have to deal with the color limitations of the CMYK colorspace. Seasonality uses a lot of blues, which CMYK wreaked havoc upon, so I had to choose a screenshot carefully to make sure it still looked good. Next, I had to deal with the print design being a fixed entity. Application (and to some extent, web) interfaces are dynamic, so I needed to find a good way to portray information in a non-changing medium. Finally, I needed to make sure all the necessary information was on the flyer somewhere. I came pretty close to printing a design without any kind of URL noting where to purchase Seasonality. Incredible, yes… that would have made the flyers next to useless. I spent hours designing the flyer, and it took a second viewer only a few minutes to notice the lack of any kind of link. The moral of the story: have someone check your work before shipping it off to print.

3. Pens

The pens were a last-minute idea that I think will be pretty cool. Macworld exhibitors usually give away some kind of trinket, and I thought it would be fun to do the same. Most trinkets are rarely used after the conference ends, though, and I didn’t want to give someone something they would just end up throwing out afterwards. A pen will hopefully remain useful to most attendees after the conference ends.

Another thing I didn’t want to do was skimp, so I went with a metal casing instead of plastic. Of course, some plastic pens are very nice, but you can’t tell that by looking at a picture on a website. I figured a metal pen would at least have a decent weight and feel to it. At the same time, I didn’t want a pen that was too expensive either; there’s no way I would get enough sales to cover the cost of handing out pens at $10+ apiece. I ended up finding a nice metallic pen with laser engraving for $1 each at PensXpress. Their turn-around time was pretty quick, and I’m pleased with the results.

4. Profit?

After all this work, I’m not exactly sure what to expect at this point. Obviously, I hope I make enough in sales to pay for all of these materials and my trip costs, but it’s not so much the money I’m looking for here. What I would really like is increased mind-share. Thus far, all of my marketing has been directed towards Mac users who frequent news and download websites. There are certainly a lot of users who fit into this category, but what about users who don’t spend their free time online? I’m hoping to meet a lot of these other users at Macworld, and hopefully it will give me a chance to widen Seasonality’s audience.

If you’re planning to attend Macworld, be sure to look for the guy in the Seasonality shirt and stop to say hello… 🙂

Using Compressed Textures in OpenGL

I’m not sure if it’s just me, but for some reason OpenGL coding involves a lot of trial and error before getting a feature such as lighting, blending, or texture mapping to work correctly. The past few days I have been working on adding texture compression to my OpenGL map test project. Ultimately, this code will be merged with the rest of the Seasonality source tree, and it’s going to look pretty cool.

Most OpenGL developers use regular images and possibly compress them when loading them as a texture on the GPU. This is fairly straightforward, and just involves changing one line of code when loading the texture (passing a compressed internal format to glTexImage2D). It’s a huge win in graphics memory: I was using about 128MB of VRAM with texture compression disabled and only around 30MB with it enabled. I wanted to accomplish something a bit more difficult though. I’m going to be using several thousand textures, so I would like OpenGL to compress them the first time Seasonality is launched, and then save the compressed images back to disk so subsequent launches don’t have to re-compress the imagery.

The problem I ran into is that not many developers are using this technique to speed up their applications, so sample code was scarce. I found some in a book I bought awhile back called “More OpenGL Game Programming,” but the code was written for Windows and didn’t work on Mac OS X. So I dove deep into the OpenGL API reference and hacked my way through it. The code below is a simplified version of the method I’m using. It should integrate with your OpenGL application, but I can’t guarantee this completely because it is excerpted from my project. If you have a problem integrating it though, post a comment or send me an email.

First, we have some code that will check for a compressed texture file on disk. If the compressed file doesn’t exist, then we are being launched for the first time and should create a compressed texture file.

- (bool) setupGLImageName:(NSString *)imageName
         toTextureNumber:(unsigned int)textureNumber
{
   GLint width, height, size;
   GLenum compressedFormat;
   GLubyte *pData = NULL;

   // Attempt to load the compressed texture data.
   if ((pData = LoadCompressedImage("/path/to/compressed/image", &width, &height,
       &compressedFormat, &size)))
   {
      // Compressed texture was found, image bytes are in pData.
      // Bind to this texture number.
      glBindTexture(GL_TEXTURE_2D, textureNumber);

      // Define how to scale the texture.
      glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
      glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

      // Create the texture from the compressed bytes.
      glCompressedTexImage2D(GL_TEXTURE_2D, 0, compressedFormat,
                             width, height, 0, size, pData);


      // Define your texture edge handling, here I'm clamping to the edge.
      glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
      glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
      // Free the buffer (allocated in LoadCompressedImage)
      free(pData);
      return YES;
   }
   else {
      // A compressed texture doesn't exist yet, run the standard texture code.
      NSImage *baseImage = [NSImage imageNamed:imageName];
      return [self setupGLImage:baseImage toTextureNumber:textureNumber];
   }
}

Next is the code to load a standard texture. Here we get the bitmap image rep and upload the texture to the GPU, letting the driver compress it. Then we read the compressed texture back and write it to disk.

- (bool) setupGLImage:(NSImage *)image
         toTextureNumber:(unsigned int)textureNumber
{
   NSData *imageData = [image TIFFRepresentation];
   NSBitmapImageRep *rep = [[NSBitmapImageRep alloc] initWithData:imageData];
   // Add your own error checking here.

   // Use the pixel dimensions rather than [rep size], which is in points.
   NSSize size = NSMakeSize([rep pixelsWide], [rep pixelsHigh]);
   // Again, more error checking.  Here we aren't using
   // MIPMAPs, so make sure your dimensions are a power of 2.

   int bpp = [rep bitsPerPixel];

   // Bind to the texture number.
   glBindTexture(GL_TEXTURE_2D, textureNumber);
   glPixelStorei(GL_UNPACK_ALIGNMENT, 1);

   // Define how to scale the texture.
   glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
   glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

   // Figure out what our image format is (alpha?)
   GLenum format, internalFormat;
   if (bpp == 24) {
      format = GL_RGB;
      internalFormat = GL_COMPRESSED_RGB_S3TC_DXT1_EXT;
   }
   else if (bpp == 32) {
      format = GL_RGBA;
      internalFormat = GL_COMPRESSED_RGBA_S3TC_DXT5_EXT;
   }
   else {
      // Unsupported pixel format, bail out before using garbage values.
      [rep release];
      return NO;
   }

   // Read in and compress the texture.
   glTexImage2D(GL_TEXTURE_2D, 0, internalFormat,
                size.width, size.height, 0,
                format, GL_UNSIGNED_BYTE, [rep bitmapData]);

   // If our compressed size is reasonable, write the compressed image to disk.
   GLint compressedSize;
   glGetTexLevelParameteriv(GL_TEXTURE_2D, 0,
                            GL_TEXTURE_COMPRESSED_IMAGE_SIZE,
                            &compressedSize);
   if ((compressedSize > 0) && (compressedSize < 100000000)) {
      // Allocate a buffer to read back the compressed texture.
      GLubyte *compressedBytes = malloc(sizeof(GLubyte) * compressedSize);

      // Read back the compressed texture.
      glGetCompressedTexImage(GL_TEXTURE_2D, 0, compressedBytes);

      // Save the texture to a file.
      SaveCompressedImage("/path/to/compressed/image", size.width, size.height,
                          internalFormat, compressedSize, compressedBytes);

      // Free our buffer.
      free(compressedBytes);
   }

   // Define your texture edge handling, again here I'm clamping to the edge.
   glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
   glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

   // Release the bitmap image rep.
   [rep release];

   return YES;
}

Finally we have a few functions to write the file to disk and read it back. These functions were pulled almost verbatim from the OpenGL book. In the first code block above we called LoadCompressedImage to read the texture data from disk, and in the second we called SaveCompressedImage to save the texture out. Nothing really special is going on here. We write some parameters at the head of the file so we have the details when reading it back in: bytes 0-3 are the image width, 4-7 the image height, 8-11 the format (GL_COMPRESSED_RGB_S3TC_DXT1_EXT or GL_COMPRESSED_RGBA_S3TC_DXT5_EXT), 12-15 the size of the image data in bytes, and bytes 16 onward are the image data itself.

void SaveCompressedImage(const char *path, GLint width, GLint height,
                         GLenum compressedFormat, GLint size, GLubyte *pData)
{
   FILE *pFile = fopen(path, "wb");
   if (!pFile)
      return;

   GLuint info[4];

   info[0] = width;
   info[1] = height;
   info[2] = compressedFormat;
   info[3] = size;

   fwrite(info, sizeof(GLuint), 4, pFile);
   fwrite(pData, size, 1, pFile);
   fclose(pFile);
}

GLubyte * LoadCompressedImage(const char *path, GLint *width, GLint *height,
                              GLenum *compressedFormat, GLint *size)
{
   FILE *pFile = fopen(path, "rb");
   if (!pFile)
      return NULL;

   GLuint info[4];
   if (fread(info, sizeof(GLuint), 4, pFile) != 4) {
      fclose(pFile);
      return NULL;
   }
   *width = info[0];
   *height = info[1];
   *compressedFormat = info[2];
   *size = info[3];

   GLubyte *pData = malloc(*size);
   if (!pData || fread(pData, *size, 1, pFile) != 1) {
      free(pData);
      fclose(pFile);
      return NULL;
   }
   fclose(pFile);
   return pData;   // Caller is responsible for freeing pData.
}

Hopefully this will save someone development time in the future. If you catch any errors, let me know.

A Weather Developer's Journey Begins!

A few months ago an idea came to me for a new domain name I should pick up. I wasn’t sure what I was going to do with the domain yet, or if I would even use it at all, but I wanted to jump on it because it was available. It struck me as a good name, and I was a bit surprised that it was still available.

Zoom forward to a couple of weeks ago, when a different idea for a new website came to mind. Oftentimes I post detailed entries here specifically about the weather, or very tech-heavy weather articles, when I run into issues while developing Seasonality. The thing is, I’m not sure my typical reader is interested in these postings, and occasionally I’ll refrain from posting about weather-related issues simply because I don’t think they would fit well in the *Coder blog.

But wait a second…I bought that domain awhile back, maybe that would work. Actually, it ended up being perfect for my new site idea. So I started working on it off and on. I’ve been happily using WordPress to host this blog for quite some time, so I decided to use the same platform for the new site. I set things up, customized a theme, and wrote a posting or two. I think now it is finally ready to be revealed, and I wanted to share. The domain is weatherdeveloper.com, and the site is called “A Weather Developer’s Journey.” With the full-time development of Seasonality 2 coming soon here, I thought it would be a perfect time to start a site like this, as I’ll be spending a lot of time trying to overcome issues with data hosting, manipulation, and weather visualization.

If you’re at all interested in checking it out, please do so (Website, RSS Feed). I’ll most likely post links from here to the first few articles on the new site, just to get the ball rolling. The first article talks about finding and putting to use 90 meter elevation data.

Summer Update

It’s been almost two months since I’ve posted here, so to avoid the risk of this blog becoming a dinosaur, I thought I would post an update.

Katrina and I returned from our 6-7 week road trip in the beginning of July. We drove out to California at the end of May, and stayed with family for several weeks (also hitting WWDC, of course). On the way out there, we took the northern route, hitting Mt. Rushmore, Yellowstone, the Tetons, and the Salt Flats. On the way back, we started in Santa Monica and drove Route 66 all the way through St. Louis, taking the freeway the rest of the way home after running out of time. We’ll have to drive the rest of Route 66 from St. Louis to Chicago sometime soon. Overall, it was quite a trip. Watch my Flickr stream for photos of the trip.

I’ve been working on finishing up DynDNS Updater 2.0, which will hopefully be ready soon. The app is looking pretty good. A lot of smaller details have been improved since beta 5, which collectively improve the application quite a bit.

I’m also trying to spend some time on Seasonality’s international forecast. I’ll post more on this later, but I’ve created some cool imagery and animations that I’ll be using to tweak the forecast generator and make it more accurate.

C4 is coming up this weekend! I’ll be taking off for Chicago tomorrow for a weekend of Indie fun. I’ll be showing an entry for the Iron Coder Live contest…which reminds me I still need to fix a bug or two there. Should be a blast. I’ll most likely be keeping my Twitter feed up to date more than posting here about stuff.

Speaking of Twitter, I started Twittering (is that a word?) a few months ago, and I’m hooked. If you don’t know, Twitter is a place to post Tweets: short bits of text (no longer than 140 characters), usually telling others what you’re up to. My first thought was how much time I would waste by doing this, but the whole idea is that posting a Tweet is supposed to be really quick. It provides some nice breaks throughout the day, and the community building around the site is pretty amazing. Check out my Twitter page, and if you’re interested, sign up and start using Twitter yourself.

That’s all folks…

DynDNS Updater 2.0 Public Beta

This just went live in the last 24 hours. Along with working on my own Gaucho Software products such as Seasonality, Dash Monitors, and XRG, I also consult for other firms. I’ve been working with Jeremy and the rest of the DynDNS team on their Mac update client for quite some time. Just recently, version 2.0 of the project reached a stage where it’s ready for public beta consumption. The interface, developed by FJ de Kermadec and his team at Webstellung, is top-notch and does a great job hiding the complexity of everything going on behind the scenes. We still have some work to do before a final 2.0 release is ready, but I think the app is looking pretty good thus far.

If you use DynDNS services, check it out. If you aren’t using DynDNS services, give their website a look-see to determine if you should be using their services. 🙂

Server Colocation

I’m happy to say that after spending the past 5 months in the pipeline, Rio (the Gaucho Software Forecast Server) has now been moved to a colocation facility. The facility provides redundant power, cooling, and several connections to Tier 1 network providers, which should definitely increase the server’s overall reliability. Previously, this server was located at my office. I had it hooked up to a UPS battery backup that gave it about 30 minutes of runtime, but it’s a far cry from true redundant power. Also, over the past several months, it seems that my business network connection reliability has been slowly decreasing. This should fix that issue.

Rio in the rack...

Before moving the server, I thought it would be a good idea to add a bit more memory and hard drive space to the box. I bumped up the memory to the motherboard’s max of 4GB, which gives some more breathing room for the virtual machines I’m running via VMware. I also added another 300GB hard drive and switched from a 3 disk RAID 5 configuration to a 4 disk RAID 10. I had been reading on PostgreSQL mailing lists that for configurations with less than 6-8 hard drives, RAID 10 was substantially faster than RAID 5. RAID 5 has always been infamously slow at writes, but the read speeds are pretty good in general, so I had my doubts. Well, my doubts were definitely unfounded, because this single hard drive upgrade has given a dramatic performance increase. Previously, when running the forecast generator (twice a day), the processing would take approximately 2 hours. Now, after adding the 4th disk and switching to RAID 10 using an XFS filesystem (more on this below), the same process takes only 1 hour and 10 minutes.

Rio inside...

Since I was starting with a fresh RAID partition, I thought I should put some time into evaluating different Linux filesystems. I used Bonnie++ to perform the disk benchmarks with a temp file size of 16GB. First, a standard Ext3 filesystem:

Version  1.03       ------Sequential Output------ --Sequential Input- --Random-
                    -Per Chr- --Block-- -Rewrite- -Per Chr- --Block-- --Seeks--
Machine        Size K/sec %CP K/sec %CP K/sec %CP K/sec %CP K/sec %CP  /sec %CP
rio.gauchoso 16000M 50719  83 103617  41 43095  10 51984  76 117808  13 285.4   1
                    ------Sequential Create------ --------Random Create--------
                    -Create-- --Read--- -Delete-- -Create-- --Read--- -Delete--
              files  /sec %CP  /sec %CP  /sec %CP  /sec %CP  /sec %CP  /sec %CP
                 16  2730  98 +++++ +++ +++++ +++  3024  99 +++++ +++  9129  99

Not bad…just over 100MB/sec writes and 117MB/sec reads. Notice the random seeks/sec value of 285.4/sec and file creates of 2730/sec. On a database system, the disk heads are given quite a workout, reading from indexes and data tables all over the disk. So seeks/sec performance was important to me. Memory cache helps, but my database is around 30GB on disk, so caching only goes so far. Overall, the ext3 numbers sounded pretty good, but I didn’t have anything to really compare them to.

I decided to try out SGI’s XFS filesystem. After seeing several benchmarks online between filesystems like ReiserFS, JFS, Ext3, and XFS, it seemed that XFS almost always had the best performance in general, so I gave it a go. XFS has a lot of nice features, including tools to defragment a partition while the system is active, the use of B+ trees for volume indexes (resulting in a much greater efficiency when you have a ton of files in a single directory), and a pretty decent tool to repair broken filesystems as well. I reformatted the partition and ran Bonnie++ again with these results:

Version  1.03       ------Sequential Output------ --Sequential Input- --Random-
                    -Per Chr- --Block-- -Rewrite- -Per Chr- --Block-- --Seeks--
Machine        Size K/sec %CP K/sec %CP K/sec %CP K/sec %CP K/sec %CP  /sec %CP
rio.gauchoso 16000M 55354  79 126942  21 32567   6 47537  70 126927  14 415.8   1
                    ------Sequential Create------ --------Random Create--------
                    -Create-- --Read--- -Delete-- -Create-- --Read--- -Delete--
              files  /sec %CP  /sec %CP  /sec %CP  /sec %CP  /sec %CP  /sec %CP
                 16  3917   2 +++++ +++  2662   3  3102   5 +++++ +++  2612   3

Write speed increased a healthy 22% to almost 127MB/sec, and read speed increased almost 8% to a similar 127MB/sec rate. But look closer at the seeks/sec and creates/sec rates… Seeks/sec increased an incredible 45% to 415.8/sec, and file creates improved 43% to 3917/sec. The drawback? Deleting files is quite a bit slower, 71% slower to be exact. To me, this tradeoff was well worth the gains, as it’s fairly rare for me to be deleting lots of files on the server. I have noticed a slight performance degradation while using rm with a lot of files, but it’s still a RAID so performance is certainly acceptable.

It’s not good to use just a single benchmarking tool, so I checked the validity of a couple of Bonnie++’s claims with a few simple dd commands, this time using 8GB file sizes:

root@rio:/rio# dd if=/dev/zero of=/rio/bench bs=8192 count=1000000
1000000+0 records in
1000000+0 records out
8192000000 bytes (8.2 GB) copied, 63.7116 seconds, 129 MB/s
root@rio:/rio# dd if=/rio/bench of=/dev/null bs=8192 count=1000000
1000000+0 records in
1000000+0 records out
8192000000 bytes (8.2 GB) copied, 59.2728 seconds, 138 MB/s

Those match up pretty well on the writes, and reading from the disk sequentially with dd is even faster than Bonnie++ claims.

Overall, I’m pretty pleased with the upgrade. I’m even happier to have this server at a datacenter. This should give me a lot of room to grow with hosting more weather data for upcoming Seasonality features, and also gives me a good server to run some other services on as well.

Happy Birthday Seasonality!

Seasonality is two years old today. Sometimes when I look back at the progress made over several releases, it’s surprising just how much the app has changed. I thought this might be a good time to revisit Seasonality’s development path and maybe talk about where Seasonality is going from here. This is a long posting, so go grab one of your favorite beverages, kick back, and enjoy the ride.

Most people find it surprising that the idea for Seasonality didn’t start as an application at all, but instead as a screensaver. The screensaver wasn’t even going to focus on the weather. I wanted to design something to show weather, news, stock info, and other information. I called it gKiosk. The g came from Gaucho Software (I thought I would produce a g-line of products instead of the i-line that Apple chose), and Kiosk was the best term I could think of to describe all this information together. Since I already had some code from XRG to generate weather graphs, I started porting that first and working on the weather section. After a few weeks of working on the project, I began to realize just how involved it would be to even produce just the weather section of gKiosk.

I’ve always had a casual interest in the weather. I can remember being stoked after getting a cheap plastic weather station kit when I was a kid. The instruments were useless for taking accurate measurements, but they were fun to play with. My childhood interest in meteorology didn’t develop much past that, but a general curiosity remained. With this casual interest, and some half-written code I decided to cut the news and stock portions out of the project and chose to develop a dedicated weather screensaver.

Shortly thereafter, I came to another fork in the road. I wanted to develop a product that was going to sell; after all, I was working on Gaucho Software full time, and without sales I had no income. Users will buy programs they find useful, but no one really “uses” a screensaver. Screensavers start when a user walks away from their computer, and as soon as they return, the screensaver is brushed aside so some real work can be accomplished. I didn’t want to develop a program that would be brushed aside, so at this point, I took the screensaver view I had worked on and shoehorned it into an NSWindow. Gaucho Software’s first new program was going to be a weather application, and it was going to be called Seasonality.

Three to four months later I had a basic concept working. I wanted to get some feedback from friends at this point to see if I was on the right track, so on February 7, 2005 I sent an email out with a Seasonality 0.1 release. Here’s part of that email:

Well, it's time to release version 0.1 of another product. This is the first pre-release version of a new weather application called Seasonality. I've been working on this application on and off for the past 6 months, resulting in just over 10k lines of code (almost as many as XRG with over 2.5 years of off-hours development time). This is probably already assumed, but please don't pass this link along to other people. If you know of someone else who you think would be interested in testing out this app, let me know.

The purpose of this release is to get some feedback from a select group of people about the design and implementation of the application itself, and I would appreciate any feedback you could give me. Questions to think about: Does the app have the basic features that you would be looking for in a weather application? Does the work-flow of the app make sense (method of adding and removing locations, changing the display, prefs, etc)? Is there anything that blatantly stands out to you as not making sense in the design? Is there anything missing (besides the obvious lack of international support)? :-) Basically, I'm looking for a sanity check before adding the final features and start beta testing for a public release.

Feedback at this stage was highly beneficial, just to make sure I was on the right track while finishing development. Here’s a screenshot from just before that release that I’ve never posted before. Looking back on it, the interface was pretty embarrassing, but I was able to clean it up before 1.0. 🙂


Pre-Seasonality 0.1

While the general layout is the same, I can’t really recognize any part of this window in the current release. Fortunately, it seemed that even though the application was very rough around the edges, the concept was good; it just needed polish, a lot of polish. One thing I’m sure many users are thankful for, though, is the color theme switching from a “sunny” yellow to a blue, which looks a whole lot nicer next to other OS X applications.

A lot changed in the 6 weeks after that. I reworked the forecast into a more compact display with a stacked day/night view, informational text was added to the location table (along with a slicker interface for adding and removing locations), radar images turned into something I actually generated instead of an image pulled off a website, and the graphs became something that could actually be customized a bit. The 1.0 beta period was pretty intense, and fortunately Katrina was putting in long hours at the office so I could do the same.

I don’t remember much about March 21st, 2005, but I do remember not sleeping much the night before. It wasn’t that I was cramming in all the last-minute stuff; it was more the anxiety I had. What if it didn’t work on other Macs? What if the web server went offline? What if the licensing didn’t work? What if no one bought it? Luckily, it did work on other Macs, the web server was fine, licensing worked (except for one small detail–I left a serial number in a text box in the nib file…didn’t take me long to fix that one :-)), and people did buy it. Overall, it was a pretty good release. I posted a preview a few days before release here on my blog. Here’s a screenshot of Seasonality 1.0 in action…


Seasonality 1.0

Naturally, like the 1.0 release of any piece of Mac software, the support and feature requests started pouring in. It was great to hear from everyone who was giving Seasonality a try, and a lot of the feature ideas I received at this time have made it into later versions (and some are still on the list of features I’m working on!). I had several ideas for future releases at the time, but the most notable missing feature was the one mentioned in the 0.1 email above: Seasonality didn’t support international locations. So I went back to cranking out code, and 7 months later I emerged with Seasonality 1.1.


Seasonality 1.1

Seasonality 1.1 had a real interactive satellite map…you could drag to navigate all over the world. The forecast changed to its current zigzag layout to make it easier to interpret, and the cylindrical wind direction graph was born. I still don’t think any graph like it is used in other weather software, and I’m glad no one has copied the idea yet.

While a lot of new features have been added since the Seasonality 1.1 release, my programming mode kind of switched at that point. The interface had settled down a bit, and the basic feature set was looking good. I started focusing on evolving the application and molding it into what I had ultimately wanted version 1.0 to be. Seasonality 1.2 contained a lot of code cleanup, a ton of bug fixes, and some interface polish.

The polish and cleanup were a big change, but by far the biggest update in Seasonality 1.2 was French language support. I had wanted to support multiple languages for quite some time, but hadn’t a clue how to approach it. After going to WWDC that year and learning about OS X’s localization support, it didn’t look too difficult. Apple’s presentations at WWDC always seem to make things look easier than they are, though. 🙂 I heavily underestimated the amount of work required. It took weeks, if not months, of trying to catch all the different text phrases in the app. Not only did every string need a French equivalent, but spacing needed to change wherever the French string was longer than the English one. Then came the help translation, which was an ordeal in and of itself, with text encodings and other issues. Fortunately, Ronald and Philippe were dedicated translators and got everything done quickly. Some days I couldn’t keep up with all the changes and updates they were sending over. In the end, I think users really appreciated the change, not only in France but in other French-speaking countries such as Canada. Ever since, I’ve been using online translation websites fairly frequently to handle support requests.
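For anyone curious about the mechanics, Cocoa localization at the time revolved around per-language `.lproj` folders containing `Localizable.strings` files, looked up at runtime via `NSLocalizedString`. A minimal sketch of what catching one phrase looks like (the keys and translations here are hypothetical examples, not Seasonality’s actual strings):

```
/* English — en.lproj/Localizable.strings */
"Forecast" = "Forecast";
"Feels like %d°" = "Feels like %d°";

/* French — fr.lproj/Localizable.strings */
"Forecast" = "Prévisions";
"Feels like %d°" = "Ressenti %d°";
```

Every user-visible string in the code then has to go through a lookup like `NSLocalizedString(@"Forecast", nil)` instead of being a bare literal, which is exactly why catching them all takes weeks.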


Seasonality 1.2

Seasonality 1.3 looked a lot like Seasonality 1.2, with the exception of the new graph handling that allowed more than 2 graphs to be shown at one time. Journaling was the next big thing here. Seasonality finally had a 2nd window! Shortly after Seasonality 1.0 was released, I received a feature request from a gardener who wanted to keep track of and take notes on the current conditions. This was my first hint that Seasonality could be used for anything other than casual weather observation. Since then I’ve received emails from pilots, sailors, and home automation specialists, among others. It’s very satisfying when someone tells me they use Seasonality in a way I hadn’t thought of.


Seasonality 1.3

Progress was going great until August of 2006, when international support broke. Not just “broke” in the sense that a couple hours of coding could fix it, but really broke. In an application dependent on web services, there are always going to be times when an XML-RPC API changes, or a web site’s layout fouls up your parser a bit. In this case, however, Environment Canada pulled their international forecasts entirely. It had taken quite some time to find an international forecast source the first time around; it was even tougher to find a replacement. It took me a week or two just to find a new source, and that source was direct model output, rawer than any data I had ever worked with. For the first time I couldn’t just display some weather data: I had to interpret what the model was predicting, programmatically generate forecasts, and then verify that those forecasts were close to what other weather providers were predicting. It took me over a month just to write the code to interpret the data and optimize the web server.
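To give a feel for what “interpreting” model output means, here’s a toy sketch of the kind of reduction step involved: taking raw model values for one forecast period and collapsing them into a condition a user can read. This is purely illustrative; the function name, variables, and thresholds are invented for the example, not the logic Seasonality actually uses.

```python
def summarize_period(cloud_cover_pct: float, precip_mm: float,
                     temp_c: float) -> str:
    """Reduce raw model output for one forecast period to a simple
    condition string. Thresholds are invented for illustration."""
    if precip_mm >= 1.0:
        # Precipitation dominates the forecast; its type depends on temperature.
        return "Snow" if temp_c <= 0 else "Rain"
    if cloud_cover_pct >= 75:
        return "Cloudy"
    if cloud_cover_pct >= 25:
        return "Partly Cloudy"
    return "Sunny"
```

A real implementation also has to interpolate the model grid to each location’s coordinates and sanity-check the generated forecasts against other providers, which is where the month of work goes.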

The silver lining to this cloud is the added flexibility I have from this point forward. In Seasonality 1.4 I was able to allow users to add their own custom locations because I was no longer limited to just the locations Environment Canada chose to forecast. A lot of other cool features such as wind forecasts will be coming out of this in the future. When I was finally able to recover and focus on Seasonality 1.4 development, I added moon information, which makes the sundial look a lot cooler, and a ton of bug fixes and code optimizations…


Seasonality 1.4

So where does that leave us? Today I’m happy to announce a new release, Seasonality 1.4.1. There were some remaining bugs in Seasonality 1.4 that just needed to be wrapped up, especially since development of new features for Seasonality 1.x is nearing its end…

From this point on I’ll be working on Seasonality 2.0. I tagged Seasonality as “weather taken to the next level.” Now it’s time to explore the level after “next.” I’m not ready to announce any new features yet, but I can say the current list is looking pretty hot (no pun intended). I’ve been mulling over a fresh interface design in my head for the past several months–an interface design that will not only improve the display of data that Seasonality currently shows, but also make room for some really cool new feature ideas I have for Seasonality 2.1+ releases… If you’re thinking of buying Seasonality today, don’t hesitate. Seasonality 2.0 is still a ways off, and everyone who purchases Seasonality now will get a free upgrade to Seasonality 2.0.

If you’re still reading at this point, all I can say is “wow,” you have a lot of patience. 🙂 I’d like to send a thanks out to all the Seasonality users and beta testers… The past 2 years have been a blast and I can only hope for the same next year as well.

The Cult of Seasonality

I’d like to send a big thanks out to Scott McGrath for his awesome review of Seasonality on the Maccast:

Seasonality is a gorgeous OS X application, a distinctive Cocoa app that makes people stop, look and ask questions when they walk by a display showing its exotic weather maps, satellite imagery, forecast data, and charts of recent weather in your region.

MacSanta

In the spirit of the Holidays, some developers are getting together and offering 20% discount coupons on their software. If you have been considering the purchase of Seasonality, there’s never been a better time to buy. Just use the coupon code “MACSANTA” when checking out of the Gaucho Software Store, and get 20% off the purchase of Seasonality.

Thanks go out to the Rogue Amoeba crew for putting this together.
