*Coder Blog

Life, Technology, and Meteorology

Category: Weather (page 1 of 2)

Flooding Moby Dick

This weekend, a pretty heavy storm hit the California coast. One city hit particularly hard was Santa Barbara, where two restaurants at different beaches several miles apart were flooded by waves. Luckily, there were only minor injuries. The event caught me by surprise because of the coastal layout of that region. You see, the Santa Barbara coast generally faces south, so you don’t get a whole lot of big waves hitting the region. That makes an event like this especially rare. Even during the El Niño year of 1997-98, when strong storms battered the coast all winter, we never saw anything quite like this.

The most surprising incident of the two was at the Moby Dick restaurant on Stearns Wharf. Here’s a frame from a YouTube video taken by someone in the restaurant as the wave hit. Click the image to see the full video on YouTube.

There’s also a news article about what happened at KEYT.com.

The interesting thing about this destruction is where it happened. Stearns Wharf is actually on a beach facing southeast, so for swells coming in from the west off the Pacific to be strong enough to wrap around the coast and strike a southeast-facing beach this hard is quite astounding.

Let’s take a look at the swell map from that morning to see what was actually happening. CDIP has a nice view of the swell state that morning:

Swell Map from CDIP

I’ve marked Stearns Wharf on that map. As you can see, the swell was coming from directly west, which is just about the worst possible case. Any northwestern component to a swell would force it to wrap not only around the peninsula in Santa Barbara but also around Point Conception. Any southwestern component would result in the Channel Islands blocking Santa Barbara from getting hit. A swell coming from exactly west can slot right through to the Santa Barbara area, perhaps even resulting in a higher tide because of the channeling of water between the coast and the islands offshore.

You might think a westerly swell direction would be normal, but the swell in this area of California usually comes from the northwest. That’s because the strongest winds of storms like these are typically farther north, off the coast of Oregon and Washington.

From the video, it sounds like this event happened at an abnormally high tide of 6 feet (high tides are usually between 3 and 5 feet), while a 12-foot swell was reaching the coast in downtown Santa Barbara. Whenever you have a combined effect of high tide and high swell like this, disaster is sure to follow.

Hopefully Moby Dick can get things cleaned up there before too long. There are definitely a few restaurant patrons who will have a story to tell for quite some time.

Creating Seasonality Map Tiles

In a weather app, maps are important. So important that, as a developer of weather apps, I’ve learned far more than I ever cared to know about topography. When I originally created the maps for Seasonality, I had to balance download size with resolution. If I bumped up the resolution too far, the download would be too big for users on slower internet connections. If I used too low of a resolution, the maps would look crappy. I ended up settling on a 21600×10800 pixel terrain map, which after decent image compression resulted in Seasonality being a 16-17 MB download. At the time, most apps were around 5 MB or less, so Seasonality was definitely a more substantial download.

That compromise was pretty good back in 2005, but now that a half-decade has passed, it’s time to revisit the terrain I’m including in the app. Creating a whole new terrain image set is a lot of work though, so I thought I would share what goes into the process here.

First, you have to find a good source of map data. For Seasonality, I’ve always liked the natural terrain look. The NASA Blue Marble imagery is beautiful, and free to use commercially, so that was an easy decision. For the original imagery I used the first generation Blue Marble imagery. Now I am using the Blue Marble Next Generation for even higher resolution.

Next, you have to decide how you are going to tile the image. I’ve chosen a pretty simple tiling method, where individual tiles are 512×512 pixels and zoom levels change by a power of 2. Square tiles are best for OpenGL rendering, and while larger (1024 or even 2048 pixel) tiles would work, 512×512 pixel tiles are faster to load into memory, and if you’re downloading over the network they transfer faster as well. From there, you have to figure out how many tiles will be at each zoom level. I’ve chosen to use a 4×2 tile grid as a base, so the smallest image of the entire globe will be 2048×1024 pixels and made up of 8 tiles. As the user zooms in further, they will hit 4096×2048, 8192×4096, 16384×8192 pixel zoom levels and so on. I’ve decided to provide terrain all the way up to 65536×32768 pixels.
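The zoom-level arithmetic is easy to sanity-check. Here’s a small sketch (the helper name is mine, not code from Seasonality) that prints the tile count at each level:

```python
# Tile counts for a 512x512 tiling scheme that starts from a 4x2
# base grid (2048x1024 pixels) and doubles at each zoom level.
TILE = 512

def zoom_levels(base=(2048, 1024), max_width=65536):
    levels = []
    w, h = base
    while w <= max_width:
        levels.append((w, h, (w // TILE) * (h // TILE)))
        w, h = w * 2, h * 2
    return levels

for w, h, tiles in zoom_levels():
    print(f"{w}x{h}: {tiles} tiles")
# 2048x1024 -> 8 tiles, ..., 65536x32768 -> 8192 tiles
```

Six zoom levels total, with each level holding four times as many tiles as the one before it.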

Now that you have an idea of what tiles need to be provided, you need to actually create the images. This is the most time-consuming part of the process. Things to consider include the image format and compression settings to use on all the tiles, and these depend on the type of display you are trying to generate. Creating all the tiles manually would take forever, so it’s best to automate this process.

The Blue Marble imagery comes in 8 tiles of 21600×21600 pixels each (the full set of images for every month of the year is around 25 GB). I start by creating the biggest tile zoom level and work down from there. For my 65536×32768 zoom level, I’ll resize each of the 8 tiles into 16384×16384 pixel images. I use a simple Automator action in Mac OS X to do this: it takes the files selected in the Finder, copies them, and resizes the copies to the specified resolution.

Now that I have 8 tiles at the correct resolution, I need to create the 512×512 tiles for the final product. For Seasonality, I also need to draw all the country/state borders at this point, because otherwise the maps are blank. I created a custom Cocoa app that reads in a map image with specified latitude/longitude ranges, draws the boundaries, and writes out the tiled images to a folder. Since my app only handles a single image at a time, I have to drag each of the 8 tiles in separately for each zoom level. It’s not ideal, but I don’t do this too often either. For the 65536×32768 zoom level, I end up with 8192 individual tile images. Smaller zoom levels result in far fewer tiles, but you can see why automation is helpful here.
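Because the Blue Marble imagery uses a simple equirectangular projection, the latitude/longitude range covered by any one 512×512 tile falls out of basic arithmetic. A minimal sketch (my own hypothetical function, not the Cocoa app’s actual code):

```python
# Geographic bounds of one 512x512 tile in an equirectangular world
# map whose zoom level z spans (2048 * 2**z) x (1024 * 2**z) pixels.
# Returns (west, east, south, north) in degrees.
def tile_bounds(tx, ty, z):
    cols = 4 * 2 ** z              # tiles across (4x2 base grid)
    rows = 2 * 2 ** z              # tiles down
    lon_width = 360.0 / cols
    lat_height = 180.0 / rows
    west = -180.0 + tx * lon_width
    north = 90.0 - ty * lat_height
    return (west, west + lon_width, north - lat_height, north)

# The top-left tile of the base zoom level covers a 90-degree square:
print(tile_bounds(0, 0, 0))        # (-180.0, -90.0, 0.0, 90.0)
```

This is essentially what the border-drawing step needs to know for each tile it writes out.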

It’s a lot of work, but in the end the results are great. For Seasonality, along with higher resolution terrain, I’m also bringing in the Blue Marble’s monthly images. If everything goes as planned, Seasonality will show the “average” terrain for every month of the year. Users will be able to see the foliage change as well as the snow line move throughout the seasons.

Modeling a Storm

One fairly common project for a meteorology student to take on after a few years of coursework is a case study poster presentation for a conference. Having finished my synoptic-scale course series this past spring, now is a good time for me to work on a case study. What does a case study involve? Well, typically synoptic storms are fairly short-lived, lasting 4-10 days. With a case study, you take a closer look at what was happening dynamically in the atmosphere during that storm, usually over a smaller region.

Picking a storm to look at for me was easy. Four years ago this October, I was visiting family in upstate New York and a very strong storm came through the region. Usually storms in October would drop rain, but this one was strong and cold enough to drop snow, and the results were disastrous. In Buffalo, 23 inches of snow fell in 36 hours. Buffalo is used to getting this much snow in the winter, but since the leaves hadn’t fallen off the trees yet, a lot more snow collected on all the branches. Thousands of tree limbs fell due to the extra weight, knocking out power for hundreds of thousands of people. Some homes didn’t have power restored for over a week. When I drove around town the next day, it was like a war zone, having to dodge tree branches and power lines even on main roads in the city.

So it was easy for me to pick this storm (I even wrote about it back then). Next we (I’m working with my professor and friend, Marty, on this project) needed to pick something about the storm to focus on. I can’t just put a bunch of pictures up and say, “Hey, look at all the snow!” There has to be some content. For this case study, Marty thought it might be interesting to look at how different microphysical schemes would affect a forecast over that time period.

This was a really tough event to forecast. Meteorologists could tell it was going to be bad, but with the temperature right on the rain/snow boundary, it was difficult to figure out just how bad it would be and where it would hit the hardest. If temperatures had been a couple of degrees warmer and this event had resulted in rain instead of snow, it would have been a bad storm, but there wouldn’t have been the same devastation.

Microphysical schemes dictate how a forecast model transitions water between states. A microphysics scheme determines what physical conditions have to be present in the environment for water vapor to condense into liquid and form clouds, freeze into ice, or collide with other ice/water/vapor to form snowflakes. Some schemes take more properties of the atmosphere and physics into account than others, or weight variables differently when calculating these state changes. If I can find which scheme did the best job forecasting this event, then meteorologists could possibly run a small model with that same scheme on the next storm before it hits, giving them a better forecast tool.

To test these schemes, I have to run a model multiple times (once with each scheme). To do that, I had to get a model installed on my computer. Models take a long time to run (NOAA has a few supercomputers for this purpose). I don’t have a supercomputer, but my desktop Mac Pro (8×2.26 GHz Xeon cores, 12 GB RAM) is a pretty hefty machine that might just let me run the model in a reasonable amount of time. I’m using the WRF-ARW model with EMS tools, which is commonly used to model synoptic-scale events in academia. This model will compile on Mac OS X, but after a week of hacking away at it, I still didn’t have any luck. So I decided to install Linux on the Mac and run it there. First I tried Ubuntu on the bare metal. It worked, but it was surprisingly slow. Next I tried installing CentOS in VMware Fusion, and it was actually about 20% faster than Ubuntu on the bare machine. The only explanation I can think of is that the libraries the model is compiled against were built with better compiler optimizations in the CentOS distribution. So not only do I get a faster model run, but I can also use Mac OS X in the background while it’s running. Perfect.

Once the model is installed, I have to set up a control run using parameters generally used in the most popular forecast models. There are several decisions to make at this stage. First, a good model domain needs to be specified. My domain covers a swath of 1720×1330 kilometers over most of the Great Lakes area, centered just west of Buffalo. For a storm this large, a 4 km grid spacing is a pretty good compromise between showing enough detail and not taking years for the model to run. For comparison, the National Weather Service uses a 12 km grid spacing over the whole US to run their NAM forecast model 4 times a day. To complete the area selection, we have to decide how many vertical levels to use in the model. Weather doesn’t just happen at the earth’s surface, so here I set the model to look at 45 levels from the surface up through around 50,000 feet. (I say “around” because in meteorology we look at pressure levels, not height specifically, and with constantly changing pressure in the atmosphere the height can vary. The top boundary the model uses is 100 millibars.)

In case you didn’t notice, this kind of domain is quite large in computing terms. There are a total of 5,676,000 grid points in 3 dimensions. When the model is running, it increments through time in 22 second intervals. The model calculates what happens at each of those grid points over that 22 seconds, and then it starts all over again. Usually, the model only writes out data every hour, and I think it’s pretty apparent why this is the case. If I configured the model to output all the data at every time step, there would be more than 44 billion point forecasts saved for the 2 day forecast run. Each of these forecasts would tell what the weather would be like at a particular location in the domain at a particular time, and each would have around 30-50 variables (like temperature, wind speed, vorticity, etc.). If those variables were simple 32 bit floats, the model would output about 6 TB of data (yes, with a T) for a single run. Obviously this is far from reasonable, so we’ll stick to outputting data every hour, which results in a 520 MB data file each time. Even though we are outputting a lot less data, the computer still has to process the full 6 TB (and the hundreds of equations that derive that data), which is quite incredible if you think about it.
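Those figures hold up to some quick back-of-envelope arithmetic (the 40-variable count below is just a midpoint assumption from the 30-50 range):

```python
# Rough check of the model run's data volume.
grid_points = 5_676_000              # 3-D grid points in the domain
time_step = 22                       # seconds per model step
steps = 2 * 24 * 3600 // time_step   # steps in a 2-day run (~7854)

point_forecasts = grid_points * steps
print(f"{point_forecasts / 1e9:.0f} billion point forecasts")  # ~45

# Assume ~40 variables per point, stored as 4-byte floats.
total_bytes = point_forecasts * 40 * 4
print(f"{total_bytes / 1e12:.1f} TB per run")
```

Depending on whether you assume 30 or 50 variables per point, the total lands anywhere from roughly 5 to 9 TB, consistent with the several-terabyte figure above.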

My Mac is executing the control run as I’m writing this. To give you an idea, it will take about 12 hours for the model run to finish with VMware configured to use 8 cores (the model doesn’t run as quickly when you use hyperthreading) and 6 GB of RAM. That leaves the remaining logical cores and 6 GB of RAM for everything else on my Mac, and so far I don’t notice much of a slowdown at all, which is great.

So what’s next? Well, after getting a good control run, I have to go back and run the model again for each of the microphysics schemes (there are 5-7 of them) and then look through the data to see how the forecast changes with each scheme. I’m hoping that one of them will clearly result in a forecast that is very close to what happened in real life. After I have some results, I will write up the content to fill a poster and take it with me to the conference at the beginning of October. The conference is in Tucson, which is great because I will have a chance to see some old friends while I’m there.

What does this mean for Seasonality users? Well, learning how to run this model could help me improve international Seasonality forecasts down the line. I could potentially select small areas around the globe to run more precise models once or twice a day. With the current forecast using 25-50 km grid spacing, running a 12 km spacing would greatly improve forecast accuracy (bringing it much closer to the forecast accuracy shown at US locations). There are a lot of obstacles to overcome first. I would need to find a reasonably sized domain that wouldn’t bring down my server while running. Something that finishes in 2-3 hours might be reasonable to run each day in the early morning hours. This would be very long term, but it’s certainly something I would like to look into.

Overall it’s been a long process, but it’s been a lot of fun and I’m looking forward to not only sifting through the data, but actually attending my first meteorology conference a couple of months from now.

Traveling Nightmare

2008 shall now forever be known as the year of the shitty holiday travel episode. I’m writing this at 2:30am on December 24th while sitting at Chicago O’Hare airport, waiting for our flight to take off “tomorrow” morning at 8am. For some reason, our travel this year has been plagued by delays and abnormalities.

Most of this has been caused by the unseasonably snowy weather we’ve had this year. In Mount Pleasant, for instance, we’ve already had over 45 inches of snowfall this year. With our typical seasonal average around 50-60 inches, we’re approaching that quickly with 2-3 winter months still to go. We’ve broken records for the most snow in December (around 35 inches so far). The previous record was 21 inches back in 1907…

This has made for a pretty miserable travel season. For us, it started late last week when we were driving to Rochester, NY. We were planning to leave Friday morning, but a storm was coming in, so we decided to get a head start on it Thursday night. We left at 5pm and ended up near St. Catharines, Ontario around midnight, so we decided to stop for the night. Of course, on Friday morning we woke up to blizzard-like conditions and 4-6 inches of new snow already on the ground. We decided to tough it out; after all, we only had 2 hours of driving left, so how bad could it be? We have snow tires on the car, which (usually) make driving so much easier. We spent 3 hours battling our way to Buffalo in near white-out conditions, passing who knows how many cars stuck in ditches along the way, and decided to hold off at a family member’s house until the snow let up. Finally, after another 3 hours of driving that evening, we made it to Rochester. A day and a half of traveling when it should have only taken us 8 hours.

Then, this past Sunday we were planning to get together with the family, but of course after the large storm passed, we were left with some pretty strong lake-effect snow. Some areas around Buffalo got another foot of snow that day, and it was bad enough to cancel our family party for that evening. Bummer…after spending all that time driving to NY, a good part of the reason we went didn’t end up happening at all…

Monday, we drove from Rochester back to Grand Rapids, MI. Driving this day was pretty good in general, though it was snowing pretty heavily by the time we got to Grand Rapids. We stayed the night in a hotel, with our flight to California leaving the next afternoon (Tuesday, the 23rd). Tuesday morning we woke up to another 6 inches of snow. We ran some errands that morning, and between the snow and a busy Christmas shopping day, it took us about an hour to drive 5 miles in town. We made it to the airport by around 3:30pm, without seeing any break in the snow at all during the day we were there.

Of course, with our flight going through O’Hare, I kind of expected that we might be setting ourselves up for a long travel day. Our 4:45 flight from Grand Rapids to Chicago didn’t end up leaving until 7:45pm. Our connecting flight to San Francisco was delayed until 9:15pm, and with the time change we thought we had a good chance to make it. Sure enough, we landed in Chicago at 8:10, plenty of time to make our connection, until American Airlines pulled some stupid shit…

First, we couldn’t park at our gate because there was already another plane there. Then we proceeded to taxi for 45 minutes before getting to our new gate. To top it all off, the walkway on the new gate wasn’t working, so we had to wait until they figured out WTF was going wrong. Finally off the plane, I ran over to the next departure gate (it was only 3 gates away) to find our connecting plane sitting there being de-iced. Except nobody was at the gate, and the doors were locked. I ran over to the next gate to see if they would re-open the doors (there were 10 of us making this same connection). Seems simple…finish de-icing the plane, open the door, let 10 of us on the flight, close the door back up and let’s go. Nope…they wouldn’t do it.

Now rebooking a flight shouldn’t be a problem, but with all the cancellations today, everyone is trying to rebook. We ended up waiting in line for 2 hours, but finally got a connection on United for tomorrow morning at 8am. Walking to the other terminal took us around a half hour, and then we had to wait for another hour in line at a United counter to get our new tickets issued, only to find out that American booked the tickets without actually reissuing them to United, so while our seats were reserved, United couldn’t actually give us tickets for them. A call to American and another hour waiting in line finally resulted in some tickets on the next flight out tomorrow morning.

I suppose we should consider ourselves lucky…I can’t count how many people we’ve run into who have been delayed until the 26th; had to call off their trip completely; or even worse, made it halfway to their destination only to have to turn around and fly back home. Now as long as that flight at 8am doesn’t get cancelled…


Wow, I think this is the first time I’ve opened MarsEdit in months. Looks like my last post was back in February, so I figure an update here is long overdue. I don’t have any particular topic to talk about today, so this post will be a catchup of everything happening here in the past 3 months.

The biggest change has been a new consulting gig I picked up back in March. Clint posted on Twitter about a contract position for an iPhone developer on the Ars Technica Job Board. The kicker is that the job was to code a weather application. I had been curious about iPhone coding, but didn’t have time in my development schedule to fit in another pet project. On the other hand, if I could learn iPhone development while getting paid, I could definitely shift some projects around. Being a weather app, this job was too good to pass up, so I sent in my resume one morning back in March. That afternoon, the company got in touch with me for an interview, and the following week I flew out to their headquarters to get up to speed on the project.

The development cycle for this app was pretty quick. With the first deadline of a working demo only 3 weeks from the day I started, I really booked it and started pumping out code. My life was pretty much coding, from the time I woke up until I went to bed. A rough but fairly good demo was completed, with 10k lines of code written in those first 3 weeks. Then I had about a week off, which incidentally was the same week as my 30th birthday. It was great to take a little bit of time off, party with some friends, and enjoy life.

Then the second stage of the project kicked in, which needed to be completed in only 2 more weeks’ time. The second stage was definitely slower, so I was able to sleep a little bit more and see Katrina from time to time. 🙂 The resulting stage 2 app was pretty polished. The company I’m working with has a few contacts at Apple, so they arranged to demo it in Cupertino. That was a couple of weeks ago, and from what I heard, the demo went pretty well. All the work definitely paid off. You should see this product hit the market some time this summer. I’ll definitely post more about it when the time comes.

Our Moke

After all that work, and with Katrina’s semester coming to a close, we decided to take off on a vacation. We found a great deal on airfare and a hotel down in Barbados, so we jumped on it. We spent last week on the south coast of the island soaking up the sun, learning the culture, having a blast driving around in our little moke (see photo), and just getting some good R&R. There’s not a ton of stuff to do on the island, but definitely enough to keep you occupied for a week or two. We toured one of the 14 Concorde jets in existence, visited some caves, walked through a historical museum, snorkeled with some sea turtles, and enjoyed some excellent food.

With a constant 15 mph trade wind, the surf on Barbados was better than any other Caribbean island I’ve visited. Furthermore, our hotel room opened up onto the beach, so I was able to walk about 50 feet from our patio and paddle out to bodyboard. Needless to say, several surf sessions took place that week.

With summer finally finding its way to central Michigan, the mountain biking season has now begun. Bodyboarding being a fairly difficult activity in Michigan, mountain biking has become my main form of exercise. For the past 10 years, I’ve been riding a Trek hardtail. I’ve put over 3000 miles on it, and the gears are almost completely shot. So I was faced with a decision: either spend a couple hundred bucks on a new set of cogs, bearings, and a chain, or break down and purchase a whole new bike.

I had been looking at getting a full suspension bike for the past few years, so I started visiting bike shops around here to ride some different models. I had hit every bike shop in a 30 mile radius, without any luck. Finally, while we were down in Lansing for the day, I checked a few bike shops down there and found my new ride. Of course the bike shop didn’t have the right frame size, so I had to order it.

New Bike

A week later it arrived, and I picked it up the day after we got back from Barbados. So far, I love it. It’s a Trek Fuel EX 5.5, complete with disc brakes, 3-5 inches of adjustable travel in front, and 5 inches of travel in back. Clipless pedals were not included, so I swapped mine over from the old bike. I also added a seat pack (with tools to fix a flat and a few other necessities) and installed a new speedometer. My previous bike was so old that even with the full suspension and a much beefier frame, this bike is lighter than my last. This weekend will be the first time I take it on the trail…definitely looking forward to it.

Looking toward the summer, I’ll be headed out to WWDC in San Francisco next month. A lot of good parties are starting to fall into place, so it should be a fun week. After that, we’re heading over to camp in Yosemite for a few days before coming home and spending the rest of the summer here working.

A Weather Developer's Journey Begins!

A few months ago an idea came to me for a new domain name I should pick up. I wasn’t sure what I was going to do with the domain yet, or if I would even use it at all, but I wanted to jump on it because it was available. It struck me as a good name, and I was a bit surprised that it was still available.

Zoom forward to a couple of weeks ago, when a different idea for a new website came to mind. Oftentimes I post detailed entries here specifically about the weather, or very weather-tech-related articles when I run into issues while developing Seasonality. The thing is, I’m not sure my typical reader is interested in these posts, and occasionally I will refrain from posting about weather-related issues simply because I don’t think they would fit well in the *Coder blog.

But wait a second…I bought that domain awhile back; maybe that would work. Actually, it ended up being perfect for my new site idea. So I started working on it off and on. I’ve been happily using WordPress to host this blog for quite some time, so I decided to use the same platform for the new site. I set things up, customized a theme, and wrote a post or two. I think it’s finally ready to be revealed, and I wanted to share. The domain is weatherdeveloper.com, and the site is called “A Weather Developer’s Journey.” With full-time development of Seasonality 2 coming up soon, I thought it would be a perfect time to start a site like this, as I’ll be spending a lot of time trying to overcome issues with data hosting, manipulation, and weather visualization.

If you’re at all interested in checking it out, please do so (Website, RSS Feed). I’ll most likely post links from here to the first few articles on the new site, just to get the ball rolling. The first article talks about finding and putting to use 90 meter elevation data.

Server Colocation

I’m happy to say that after spending the past 5 months in the pipeline, Rio (the Gaucho Software Forecast Server) has now been moved to a colocation facility. The facility provides redundant power, cooling, and several connections to Tier 1 network providers, which should definitely increase the server’s overall reliability. Previously, this server was located at my office. I had it hooked up to a UPS battery backup that gave it about 30 minutes of runtime, but it’s a far cry from true redundant power. Also, over the past several months, it seems that my business network connection reliability has been slowly decreasing. This should fix that issue.

Rio in the rack...

Before moving the server, I thought it would be a good idea to add a bit more memory and hard drive space to the box. I bumped up the memory to the motherboard’s max of 4 GB, which gives some more breathing room for the virtual machines I’m running via VMware. I also added another 300 GB hard drive and switched from a 3 disk RAID 5 configuration to a 4 disk RAID 10. I had been reading on PostgreSQL mailing lists that for configurations with fewer than 6-8 hard drives, RAID 10 is substantially faster than RAID 5. RAID 5 has always been infamously slow at writes, but its read speeds are pretty good in general, so I had my doubts. Well, my doubts were definitely unfounded, because this disk upgrade alone has given a dramatic performance increase. Previously, when running the forecast generator (twice a day), the processing would take approximately 2 hours. Now, after adding the 4th disk and switching to RAID 10 with an XFS filesystem (more on this below), the same process takes only 1 hour and 10 minutes.

Rio inside...

Since I was starting with a fresh RAID partition, I thought I should put some time into looking at different Linux filesystems. I used Bonnie++ to perform the disk benchmarks using a temp file size of 16GB. Initially with a standard Ext3 filesystem:

Version  1.03       ------Sequential Output------ --Sequential Input- --Random-
                    -Per Chr- --Block-- -Rewrite- -Per Chr- --Block-- --Seeks--
Machine        Size K/sec %CP K/sec %CP K/sec %CP K/sec %CP K/sec %CP  /sec %CP
rio.gauchoso 16000M 50719  83 103617  41 43095  10 51984  76 117808  13 285.4   1
                    ------Sequential Create------ --------Random Create--------
                    -Create-- --Read--- -Delete-- -Create-- --Read--- -Delete--
              files  /sec %CP  /sec %CP  /sec %CP  /sec %CP  /sec %CP  /sec %CP
                 16  2730  98 +++++ +++ +++++ +++  3024  99 +++++ +++  9129  99

Not bad…just over 100 MB/sec writes and 117 MB/sec reads. Notice the random seeks/sec value of 285.4/sec and file creates of 2730/sec. On a database system, the disk heads get quite a workout, reading from indexes and data tables all over the disk, so seeks/sec performance was important to me. Memory cache helps, but my database is around 30 GB on disk, so caching only goes so far. Overall, the ext3 numbers sounded pretty good, but I didn’t have anything to really compare them to.

I decided to try out SGI’s XFS filesystem. After seeing several benchmarks online between filesystems like ReiserFS, JFS, Ext3, and XFS, it seemed that XFS almost always had the best performance in general, so I gave it a go. XFS has a lot of nice features, including tools to defragment a partition while the system is active, the use of B+ trees for volume indexes (resulting in a much greater efficiency when you have a ton of files in a single directory), and a pretty decent tool to repair broken filesystems as well. I reformatted the partition and ran Bonnie++ again with these results:

Version  1.03       ------Sequential Output------ --Sequential Input- --Random-
                    -Per Chr- --Block-- -Rewrite- -Per Chr- --Block-- --Seeks--
Machine        Size K/sec %CP K/sec %CP K/sec %CP K/sec %CP K/sec %CP  /sec %CP
rio.gauchoso 16000M 55354  79 126942  21 32567   6 47537  70 126927  14 415.8   1
                    ------Sequential Create------ --------Random Create--------
                    -Create-- --Read--- -Delete-- -Create-- --Read--- -Delete--
              files  /sec %CP  /sec %CP  /sec %CP  /sec %CP  /sec %CP  /sec %CP
                 16  3917   2 +++++ +++  2662   3  3102   5 +++++ +++  2612   3

Write speed increased a healthy 22% to almost 127 MB/sec, and read speed increased almost 8% to a similar 127 MB/sec rate. But look closer at the seeks/sec and creates/sec rates… Seeks/sec increased an incredible 45% to 415.8/sec, and file creates improved 43% to 3917/sec. The drawback? Deleting files is quite a bit slower, 71% slower to be exact. To me, this tradeoff was well worth the gains, as it’s fairly rare for me to delete lots of files on the server. I have noticed a slight performance degradation when using rm with a lot of files, but it’s still a RAID, so performance is certainly acceptable.
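For reference, here’s how those percentages fall out of the two Bonnie++ runs (numbers copied straight from the benchmark output above):

```python
# Percent change from the ext3 run to the XFS run (Bonnie++ numbers).
ext3 = {"block write": 103617, "block read": 117808,
        "seeks/sec": 285.4, "creates/sec": 2730, "deletes/sec": 9129}
xfs = {"block write": 126942, "block read": 126927,
       "seeks/sec": 415.8, "creates/sec": 3917, "deletes/sec": 2612}

for key in ext3:
    change = (xfs[key] - ext3[key]) / ext3[key] * 100
    print(f"{key:>12}: {change:+.1f}%")
# block write +22.5%, block read +7.7%, seeks +45.7%,
# creates +43.5%, deletes -71.4%
```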

It’s not a good idea to rely on a single benchmarking tool, so I checked a couple of Bonnie++’s claims with a few simple dd commands, this time using 8GB files:

root@rio:/rio# dd if=/dev/zero of=/rio/bench bs=8192 count=1000000
1000000+0 records in
1000000+0 records out
8192000000 bytes (8.2 GB) copied, 63.7116 seconds, 129 MB/s
root@rio:/rio# dd if=/rio/bench of=/dev/null bs=8192 count=1000000
1000000+0 records in
1000000+0 records out
8192000000 bytes (8.2 GB) copied, 59.2728 seconds, 138 MB/s

Those match up pretty well on the writes, and sequential reads with dd are even faster than Bonnie++ reported.
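As a sanity check on the rates dd reports, they follow directly from the byte counts and elapsed times (dd counts 1 MB as 10^6 bytes):

```shell
# Recompute dd's reported throughput from bytes copied and elapsed seconds.
# dd reports MB/s using MB = 10^6 bytes.
awk 'BEGIN {
    printf "write: %.0f MB/s\n", 8192000000 / 63.7116 / 1e6
    printf "read:  %.0f MB/s\n", 8192000000 / 59.2728 / 1e6
}'
```

This reproduces the 129 MB/s and 138 MB/s figures from the dd output above.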

Overall, I’m pretty pleased with the upgrade. I’m even happier to have this server at a datacenter. This should give me a lot of room to grow with hosting more weather data for upcoming Seasonality features, and also gives me a good server to run some other services on as well.

Global Warming Questioned by British Documentary

A week or two ago, a new documentary called “The Great Global Warming Swindle” was brought to my attention on a friend’s blog. While I haven’t seen the documentary itself, an article about it by Thomas Sowell contains enough inaccurate statements to make me question the validity of the documentary it references. I have taken a number of meteorology classes, and much of meteorology focuses on radiation, which also happens to be at the root of the entire global warming issue. (Global warming is just the earth and its atmosphere absorbing more radiative energy than they emit.) I’ve picked out a few choice quotes from the article and respond to them below.

There is no question that the globe is warming but it has warmed and cooled before, and is not as warm today as it was some centuries ago, before there were any automobiles and before there was as much burning of fossil fuels as today.

This statement is simply incorrect. While it is true that the globe’s temperature varies over the course of thousands of years as a result of Milankovitch cycles and other causes, there has not been any time in the past 650,000 years when the global average temperature was higher than it is today. The author is most likely referring to the Medieval Warm Period (800-1300 AD), when average temperatures were higher than normal in Europe and perhaps parts of northern Asia. However, when you compare the average global temperature of the two eras, today is much warmer than Medieval times.

According to these climate scientists, human activities have very little effect on the climate, compared to many other factors, from volcanoes to clouds.

According to the IPCC report released earlier this year (page 4), human activities have a greater effect on global climate than natural sources do. Furthermore, natural sources such as volcanoes and clouds actually cool the earth rather than warm it.

Clouds both cool and warm the atmosphere. Low clouds, like thunderstorms or a thick layer of uniform stratus, cool the earth quite a bit by reflecting solar radiation back into space. High clouds, like thin haze-like cirrus, trap some of the radiation the earth emits and warm the atmosphere slightly. The prevalence of low and mid-level clouds wins out here, for a net cooling effect.

As for volcanoes, eruptions do release ash and greenhouse gasses into the stratosphere, and this has been measurable after relatively recent eruptions like Mount St. Helens. But eruptions are rare enough that their contribution to global climate is minimal; I’m not sure why the article or the documentary even brought them up. Usually the ash dominates, blocking solar radiation for a net cooling effect, and nearby locations notice solar dimming after an eruption.

While the combined effects of clouds and volcanoes cool the earth’s climate significantly, CO2 and methane from anthropogenic sources have an even larger warming effect. Furthermore, the article brings up these two sources as if they, rather than humans, were causing global warming, when in fact they act as a negative feedback against the damage humans are causing.

These climate scientists likewise debunk the mathematical models that have been used to hype global warming hysteria, even though hard evidence stretching back over centuries contradicts these models.

I’m not sure what “hard evidence” the author is referring to, but there are several climate models developed at universities and scientific institutions worldwide. Most of these models are tested from 1960 onward, because we have solid worldwide temperature measurements for that period. All of the best models reproduce the warming that actually occurred during that time frame, which reinforces their validity. Again, I refer to the IPCC report (page 11), which shows output from up to 14 different climate models matching temperatures over the past 100 years only when anthropogenic forcing is included, as opposed to natural climate forcing alone.

What is even scarier than seeing how easily the public, the media, and the politicians have been manipulated and stampeded, is discovering how much effort has been put into silencing scientists who dare to say that the emperor has no clothes.

Hmm…I’ve actually read several news articles reporting the contrary: the government often censors scientific reports that support the global warming problem.

Academics who jump on the global warming bandwagon are far more likely to get big research grants than those who express doubts — and research is the lifeblood of an academic career at leading universities.

Actually, these grants can be used by the government as leverage to silence scientists: release your findings, and we pull your funding. Most researchers choose a topic they are interested in exploring rather than focusing on whatever topics have grant money available. Grant money is available through the NSF for a multitude of scientific topics.

Furthermore, this assumption that researchers are “going after all the money” for selfish reasons would only be valid if the researchers got to keep most of that money. This is far from the case; oftentimes the only money researchers receive directly from a grant comes from buying out their salary from teaching classes at a university. Researchers still don’t get any increase in pay; they just don’t have to teach as many courses.

In politics, even conservative Republicans seem to have taken the view that, if you can’t lick ’em, join ’em. So have big corporations, which have joined the stampede.

While there is nothing I can say to completely disprove this statement, I can say that CO2 emissions from businesses in the U.S. increase every year… If big corporations were really jumping on the global warming bandwagon, wouldn’t emissions decrease?

Furthermore, why is global warming a political issue at all? It’s known that the earth is getting warmer, and it is also known that with lower CO2 concentrations in the atmosphere, the earth wouldn’t be as warm (this has been scientifically demonstrated simply by looking at the chemical makeup of CO2 and which wavelengths of radiation it absorbs)… Wouldn’t the logical next step be to try to reduce CO2 concentrations in the atmosphere and return temperatures to around normal? I don’t see this as a liberal vs. conservative issue.

The difference is that we have the hardest and most painful evidence that there was a Holocaust. But, for the global warming scenario that is causing such hysteria, we have only a movie made by a politician and mathematical models whose results change drastically when you change a few of the arbitrarily selected variables.

Actually, I think we have much more information about global warming than just models and a movie. Google Scholar shows about 152,000 articles on global warming.

As for the “arbitrarily” selected variables, I would hardly call them arbitrary. I haven’t studied any one climate model deeply enough to know exactly how it works, but if its variables are anything like the hundreds of specified conditions used in meteorological forecast models, I would imagine they are comprehensive. Modeling is difficult, and at this point we just aren’t sure how some variables will change as the temperature increases. However, that only calls the models’ accuracy 100 years from now into question; it does not discredit the fact that the earth is warming because of anthropogenic forcing right now. On page 11 of the IPCC report, you can see that the model output matched actual temperatures much more closely when anthropogenic variables were taken into account. This alone is pretty significant evidence that the human race is affecting the earth’s climate beyond the natural environmental variations.

If you want to run a climate model on your own computer and learn more about what is involved, check out EdGCM.

If you would like to read more about the issues raised in this documentary, there is a very lengthy discussion over on RealClimate.org (529 comments on the posting at this time). Large universal topics like this deserve to be debated, and fortunately scientists have already discussed this one extensively over the past couple of decades. Governments, corporations, and to some extent the media don’t like what has been discovered and have decided to take issue with it. Unfortunately, that just delays actually doing anything to help the environment and return things to the way they were.

2007 Warmest Winter on Record

The National Climatic Data Center (NCDC) released its official report on the worldwide climate for February, and for this past winter in general (December through February). The winter of 2006-2007 was found to be the warmest on record in the past 100 years. The report has a lot of information on various aspects of climate, such as precipitation and snow pack data, but I found one image particularly striking: the temperature anomalies for December through February…

So what does this image show, exactly? Well, a whole lot of red… Seriously though, to generate this graph the NCDC averaged worldwide weather observations for December through February over the 30-year period from 1961 to 1990, then compared this winter’s December through February observations against that baseline. Temperature deviations are shown as dots of different sizes and colors for each area.
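The anomaly computation behind each dot is simple arithmetic: subtract the 1961-1990 baseline mean from this winter’s mean for a given location. A minimal sketch, using made-up station values rather than anything from the NCDC report:

```shell
# Hypothetical station values, illustrative only (not from the NCDC report):
baseline=-2.4   # 1961-1990 December-February mean, in C
current=2.1     # 2006-07 December-February mean, in C
awk -v b="$baseline" -v c="$current" 'BEGIN { printf "anomaly: %+.1f C\n", c - b }'
```

A positive result means the location ran warmer than its 30-year baseline, which is what nearly every dot on this map shows.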

What’s surprising is not only the large areas over land in the Northern Hemisphere that were 5°C (9°F) or warmer than normal, but also the expanse of warm temperatures worldwide. It’s fairly common for one section of the globe to be warmer or cooler than normal for a few months at a time, but this image is showing warmer than normal temperatures almost universally around the globe.

Some of this can be credited to the recent El Nino event we had this winter. El Nino does play a role, but it most certainly is not responsible for all the warming taking place. CO2 emissions are at an all-time high, and increasing at a faster rate than ever before. If we don’t do something to keep CO2 concentrations in the atmosphere in check, the earth will be a much different place to live 100 years from now.

What is dewpoint?

I just posted a new weather article over on the Gaucho Software Forums this morning. The article talks about what dewpoint is and how we can measure it. This is one I’ve been meaning to write for quite a while now, and I finally got a chance to spend some time on it.

If you have any comments on the article, feel free to post them on that forum topic. I have a few ideas for future articles, but I want to make sure I don’t run out of topics. If you have any suggestions for topics I should cover, get in touch with me.

© 2017 *Coder Blog
