*Coder Blog

Life, Technology, and Meteorology


A New Lens

I’ve been into photography for about 10 years now. Back in 2008 I took the step up from a point-and-shoot to a DSLR and bought a Canon 450D (aka the Rebel XSi). It’s been a great camera over the years, but for quite some time I’ve felt like I’ve outgrown it. I wanted a new camera with better low-light capabilities, along with extras like micro focus adjustments and more megapixels to work with. Last week, I found a pretty sweet deal on a Canon 5D Mark II, so I ordered one.

Part of this upgrade was bittersweet though. For the past couple of years, my favorite lens has been the EF-S 17-55mm f2.8 IS. Though it’s not an L series lens, the image quality and sharpness are incredible. It’s just an excellent standard zoom lens for Canon crop sensor cameras. Unfortunately, with the 5D being a full frame camera, the 17-55mm simply won’t work with it.

I had to decide what to do for a new walk-around lens. I thought about going the all-primes route. I love my 50mm prime, and the image quality of primes is top-notch. But primes don’t fit my style of photography as well, so I decided to stick with the zooms. I was left with a few options. If I wanted to keep an f2.8 aperture, I could grab a 24-70mm f2.8L II, or buy the older 24-70mm f2.8L on the used market. The first version of that lens is pretty heavy. The second version is lighter, and crazy sharp, but costs over twice as much. If I decided that f4 would be a good enough aperture, then I could go for the just-announced 24-70mm f4L IS, or the 24-105mm f4L IS (which is the kit lens of the 5D Mark II). Finally, Tamron has a nice 24-70mm f2.8 VC lens, which is the only full frame lens in this zoom range that offers both a wide f2.8 aperture and image stabilization.

While the Tamron was tempting, I decided to stick with the Canon options. I also threw out the 24-70mm f2.8L II pretty quickly, because it’s well outside my budget.

That left the used 24-70mm f2.8L and the f4 zooms. Each of these lenses comes with a compromise. The 24-70mm f2.8L doesn’t have IS, and is a bigger and heavier lens. I wasn’t sure I wanted to carry around that extra weight all the time. The 24-70mm f4L IS is a new lens, and is rumored to be sharper than the other two, but costs about $500 more. The 24-105mm f4L IS has the extra zoom range, but isn’t as sharp as the 24-70mm f4L IS. So I could go for speed, sharpness, or range. This is a pretty tough decision for a lot of photographers.

I decided to go for the 24-105mm f4L IS. From the reviews, it sounded about as sharp as the 24-70mm f2.8, while weighing a good amount less and offering more zoom range. That extra zoom range was a big deal to me, because often I’ll pack my 70-200mm lens while traveling, and with the extra reach of the 24-105, I might be able to leave that longer lens at home from now on.

After making my decision, I wanted to do an image comparison between the new 24-105mm lens and my current 17-55mm. So I set up a quick composition and mounted my XSi on a tripod. The camera settings I decided on were JPEG images (to remove any raw processing variables), mirror lock-up enabled, triggered with a 2 second timer, and IS disabled on both lenses. I manually focused using live view on the digit on the 1 ball, and the exposure ended up being 0.5 seconds at f4 and ISO 100. These JPEGs are straight out of the camera, with the exception of the white balance. I chose the white balance settings from one of the photos in Aperture and stamped all the other photos with it so that all the images would start with the same color settings.

Test Setup

I took photos at f2.8, f4.0, 17mm, 24mm, 55mm, and 105mm. Of course, neither lens was capable of all those combinations, but I took every shot I could. Below I’ll focus on the 24mm and 55mm focal lengths, because those overlap between the two lenses.

24mm: On the 24-105mm lens, 24mm is as wide as it gets, so the 17-55mm has an advantage of not being all the way at the end of its zoom range. The 17-55mm is also stopped down at f4, giving it an even greater advantage here. So how did the 24-105mm lens do? Well, while both lenses produced similar results, the 17-55mm was a bit sharper and the 24-105mm lens produced richer colors.

17-55mm f4.0 at 21mm

55mm: Here the 17-55mm is all the way at the long end of its zoom range, while the 24-105mm lens is pretty much in the middle. The 17-55mm still has the advantage of being stopped down though. Here, the 17-55mm lens was still sharper, but not by as much as in the 24mm frame. The color from the 24-105mm lens was still better, but again not by as much as in the 24mm frame. This comparison was probably the closest result, and I had to look pretty hard to find any difference at all between the frames.

24-105mm f4.0 at 55mm

Zooming the 24-105mm all the way out to 105mm resulted in a softer image. Some other reviews online mention the 24-105mm lens starts to get soft at around 90mm, and my 105mm shot definitely agreed with that.

How about some uneven comparisons between the two lenses? I looked at 24mm using f2.8 on the 17-55mm lens, and f4 on the 24-105mm. Here, the sharpness is about the same. If I had to pick a winner in this situation, the 17-55mm would be it, but not by much. Again, color on the 24-105mm was nicer.

Comparing the two at 55mm with the 17-55mm lens set at f2.8 and the 24-105mm lens set at f4 was the outlying result. Here the 24-105mm lens was sharper. Not by much, but it was noticeable. The 24-105 also retained its nicer color rendering.

The results were close enough (and only visible by pixel-peeping) that I don’t think I’ll notice the difference day to day with the new 24-105mm lens. But what an impressive showing by the 17-55mm. It’s just amazingly sharp, and I’ve loved using it over the past couple of years. If you’re looking for the best image quality on a standard zoom out of a crop sensor camera, you won’t find much better than the 17-55mm f2.8 IS.

What about other factors? Well, the biggest difference between these two lenses is the build quality. I didn’t take this much into account when I originally bought the 17-55 (back then I compared it to the 24-105mm as well and decided I wanted the wider field of view). When I unpacked the 24-105mm lens earlier today, I was floored at how much more solid it felt, with really smooth zoom rings. I was expecting a difference between the two, but the gap in build quality was much greater than I anticipated. The 24-105mm lens is also dust and weather resistant, so you’ll be in better shape if you are out in the elements with this one.

The other difference is the maximum aperture. With f2.8, the 17-55mm lens is much more flexible. It won’t play as big of a role in this switch for me though. Full frame cameras will typically be more sensitive to light because of the bigger image sensor. So I could easily bump my ISO up one stop on the 5D to get the equivalent light of the f2.8 on my 450D (and still have shots with less noise than on the 450D).
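The stop arithmetic behind that ISO trade-off can be checked directly (my own back-of-the-envelope sketch, not from the original post): light gathered scales with the inverse square of the f-number, so going from f2.8 to f4 costs almost exactly one stop, which a one-stop ISO bump recovers.

```python
import math

def stops_between(f_from, f_to):
    """Stops of light lost when stopping down from f_from to f_to.

    Light gathered scales with aperture area, i.e. 1 / N**2 for
    f-number N, so the difference in stops is log2 of the ratio.
    """
    return math.log2((f_to / f_from) ** 2)

print(stops_between(2.8, 4.0))  # ~1.03: f2.8 -> f4 loses about one stop
print(stops_between(2.8, 5.6))  # 2.0: two full stops
```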

I have posted full resolution copies of all the photos to my Flickr account so you can compare them for yourself. You really can’t go wrong with either lens. They both have great image quality; one’s a little wider, one’s a little longer, but overall their similarities far outnumber their differences.

The Surf Bike

Marlinspike’s blog entry “The Worst” is a good read, and calls to mind some lessons learned while I was earning my degree at UCSB.

On the UCSB campus, everyone bikes…and I mean everyone. I would guess there are around 20,000 bikes on campus, to the point where biking anywhere means navigating a ton of traffic. During my time there, I had five different bikes.

The first two were bikes from childhood that I used during my Freshman year. I put so many miles on them that, even after repairing the parts that broke, they eventually wore out to the point that I needed a new bike.

So I bought my third bike, a Raleigh something or other. It was a pretty sharp looking bike. Nothing overly expensive, but nice enough that it was stolen about a year after I bought it. Having it stolen broke my heart, because I made sure to always lock it with one of those U-locks, and it was taken from the bike racks just outside my dorm room.

I decided from then on out to never trust bike locks. My fourth bike was a Trek, and it was the first bike I had that let me really get into mountain biking (which I still enjoy as a hobby today). It was more expensive than any of my other bikes, and for that reason, I never locked it up anywhere. I stored it in my dorm room (and later inside my apartment) when I was at home. On campus, I worked in the Engineering building, so I was able to bike to work and park the bike in my office there, just walking to the rest of my classes. It worked out pretty well, but as Marlinspike would say, the bike owned me.

Then about halfway into my Junior year a bike showed up on the back patio of our apartment. It was at least 20-30 years old, and half rusted out. It was the ugliest damn bike I have ever seen. To this day, we have no idea where it came from. We left it there for a couple of weeks, to see if anyone would find it and reclaim their property. Nobody did, so I moved the bike around to the front of our apartment and parked it in the bike rack. No lock, nothing.

The bike became our apartment’s “surf bike”, because it was perfect for when we wanted to go out surfing. There weren’t bike racks to use at our local surf spots, so usually we had to spend a lot of time walking to the ocean. With the surf bike, we didn’t need to lock it up, so we just took it to the beach and left it there while we were out, and rode it back when we were done. It was liberating.

I really started to enjoy the care-free attitude of the surf bike, so a few months later I started to use it as my daily ride too. For over a year, I rode it to campus and back every day, never locking it anywhere, and nobody ever took it. There were a few squeaks in the gearing, but it never broke down on me. It really was the perfect college bike.

I used the bike all the way through the end of senior year. When it was time to move home for the summer, it didn’t feel right to take it with me. So we left it there, for the next fortunate person to discover and love.

Photo by Ryon Edwards

Overhead while using GCD

Today I spent some time optimizing the Particle Mode simulation code in Seasonality Core. While doing some measurements, I discovered that quite a bit of time was spent in GCD code while starting new tasks. I use dispatch_apply to iterate through the particles and run the position and color calculations for the next frame. In the tests below, I was simulating approximately 200,000 particles on the Macs, and 11,000 particles on the iPad.

I decided to try breaking the tasks up into fewer blocks, and run the dispatch_apply for groups of around 50 particles instead of running it for each particle. After making this change, the simulation ran in up to 59% less CPU time than before. Here are some informal numbers, just by looking at Activity Monitor and roughly estimating:

CPU Usage

Device                              Before   After   Time Savings
Mac Pro (2009, Octo 2.26GHz Xeon)   390%     160%    59%
Retina MBP (2012, Quad 2.6GHz i7)   110%     90%     18%
MacBook Air (2011, Duo 1.8GHz i7)   130%     110%    15%
iPad 3 (fewer particles)            85%      85%     0%

As you can see, the benefits from the new code running on the Mac Pro are substantial. With my earlier code, I had been somewhat puzzled about why the simulation took so many more resources on the Mac Pro than on the laptops. Clearly the overhead of thread creation is a lot higher on the older Xeon CPU. This brings the Mac Pro’s processing times closer to what the other, more modern processors can accomplish.
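For anyone curious what the chunking change looks like, here is a rough sketch of the idea. Python’s ThreadPoolExecutor stands in for dispatch_apply here, and the function names are mine, purely for illustration:

```python
from concurrent.futures import ThreadPoolExecutor

CHUNK = 50  # particles per dispatched block (the group size from the post)

def simulate_frame(particle_count, update):
    """Call update(i) for every particle index 0..particle_count-1,
    dispatching one block per CHUNK particles instead of one block
    per particle, so scheduling overhead is paid ~50x less often."""
    blocks = (particle_count + CHUNK - 1) // CHUNK  # ceiling division

    def run_block(b):
        # Each block handles a contiguous slice of particle indices.
        for i in range(b * CHUNK, min((b + 1) * CHUNK, particle_count)):
            update(i)

    with ThreadPoolExecutor() as pool:
        # Rough analogue of dispatch_apply(blocks, queue, run_block).
        list(pool.map(run_block, range(blocks)))
```

In the real GCD code the same shape applies: call dispatch_apply with the block count rather than the particle count, and loop over the block’s particle range inside the dispatched block.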

Perhaps an even more surprising result is the lack of a speedup on the iPad. While measuring both runs, the two versions averaged about the same usage. Perhaps if I had a more formal way to measure the processing time, a small difference would become apparent, but overall the difference was minimal. I’m guessing Apple has built logic into the A-series CPUs that allows for near-zero-cost context switching. It makes you wonder how much quicker something like this would run if Apple built their own desktop-class CPUs.

Race to the Bottom

A few years ago, developers of mobile apps were on a “race to the bottom.” When the iPhone App Store opened, apps were priced reasonably, but in the months that followed developers dropped prices to remain competitive until the most popular price point became just $0.99.

Today I read this article on Engadget about Amazon and Google selling their latest tablets at or below cost. Sounds familiar…

Honestly, I think the only reason Google and Amazon are doing so well in the tablet market is because they have this no-profit hardware model. Apple is the big player here, and has always priced their products to have a healthy hardware profit margin. There were several earlier Android-based tablets before the Nexus series and the Kindle Fire, all priced in the same ballpark as the iPad. Unfortunately (for these other companies), customers decided that at the same price point they would rather have an iOS device.

Amazon realized the only way they could break into the market would be to sell at cost, and make up the profits when customers purchased content to read or watch. Since Amazon also sells that same content, they are in a good position to turn a profit using this model.

The problem is consumers are now expecting all hardware to be sold at this lower price. Just look at the reaction to Apple’s recent $329 starting price of the iPad Mini. The Mini’s starting price is $80-130 more expensive than Android-based competitors, and people have had something to say about it. In reality, Apple is just pricing it in a way where it can sustain their business.

I don’t place the blame on consumers for this attitude. The latest mobile craze has people buying the newest models year after year. Corporations like Apple and Google are pushing for this consumerism attitude, so they can continue to sell record numbers of devices and maximize profit. The thing is, many people don’t need and can’t afford to buy the latest tablet/phone/etc year after year (at least not at prices with a healthy profit margin). So consumers continue to pressure the corporations to sell at lower prices. In this respect, this consumerism attitude the corporations have pushed has backfired on them.

The tablet market is still young. Last year, I thought Apple had such a big lead with the iPad, that it would be very difficult for other platforms to catch up. But a lot has happened in the last 12 months. It should be interesting to see just how this plays out in the long term.

Living in a Sandboxed World

No matter what your view on Apple’s new sandboxing requirement for the Mac App Store, if you want to keep updating your MAS apps, you’re going to need to sandbox them. I was able to sandbox Seasonality Core pretty easily. I don’t access any files outside of Application Support and the prefs file. My entitlements just require outgoing network connections, which was pretty easy to enable in the target settings.

However, I distribute two versions of Seasonality Core. One is the Mac App Store application, and the other is a version for my pre-Mac App Store customers. The question arose: should I sandbox the non-Mac App Store application? I wanted the answer to this question to be yes, but unfortunately the serial number licensing framework I am using kept me from doing this. So I was forced to sandbox the Mac App Store version, but keep the non-Mac App Store version outside the sandbox. Crap.

You might be wondering what the big deal is here. Can’t my Mac App Store customers just use one app, and pre-Mac App Store customers use the other one? Well, yes, but there are a few situations where some customers might use both versions of the app.

If someone uses both the Mac App Store version and the non-Mac App Store version, things go south quickly. The first time the sandboxed Mac App Store version is run, all of Seasonality Core’s data files will be migrated into the sandbox. That means the next time the non-Mac App Store version is opened, it won’t be able to see any of the past data Seasonality Core has collected. That’s not good.

So how did I get around this? After taking a quick poll on Twitter, it sounded like the best option for me would be to have the non-Mac App Store version reach inside my app’s sandbox if it exists. To do this, I just had to build some extra code into the method that returns my Application Support path. Here’s the new implementation:

+ (NSString *) seasonalityCoreSupportPath {
  NSFileManager *fm = [NSFileManager defaultManager];

#ifndef MAC_APP_STORE
  // Check if ~/Library/Containers/BundleID/Data/Library/Application Support/Seasonality Core exists.
  NSString *sandboxedAppSupportPath = [NSString pathWithComponents:
    [NSArray arrayWithObjects:@"~", @"Library", @"Containers",
                              [[NSBundle mainBundle] bundleIdentifier],
                              @"Data", @"Library", @"Application Support",
                              @"Seasonality Core", nil]];
  sandboxedAppSupportPath = [sandboxedAppSupportPath stringByExpandingTildeInPath];

  BOOL isDir;
  if ([fm fileExistsAtPath:sandboxedAppSupportPath isDirectory:&isDir]) {
    // We found a sandboxed Application Support directory, return it.
    if (isDir)
      return sandboxedAppSupportPath;
  }
#endif

  NSArray *appSupportURLs = [fm URLsForDirectory:NSApplicationSupportDirectory
                                       inDomains:NSUserDomainMask];
  NSString *appSupportDirectory = nil;
  if (appSupportURLs.count > 0) {
    NSURL *firstPath = [appSupportURLs objectAtIndex:0];
    appSupportDirectory = [firstPath path];
  }

  return [appSupportDirectory stringByAppendingPathComponent:@"Seasonality Core"];
}

The new code only runs if MAC_APP_STORE isn’t defined (a project definition I set elsewhere for the different builds). It checks whether there is a sandbox for the app, and if so returns the sandboxed directory. Otherwise it returns the standard Application Support directory.

This is a pretty complete solution, except that I wanted to make sure the user’s preferences were shared between the two app versions as well. NSUserDefaults won’t know to check for the existence of a sandbox. Daniel Jalkut graciously offered this solution, which I have since adapted into my own code as follows:

+ (BOOL) gsImportNewerPreferencesForBundle:(NSString *)bundleName fromSandboxContainerID:(NSString *)containerID {
  BOOL didMigrate = NO;

  NSArray *libraryFolders = NSSearchPathForDirectoriesInDomains(NSLibraryDirectory, NSUserDomainMask, YES);
  if (libraryFolders.count) {
    // Get a path to our app's preference file.
    NSString *prefsFile = [NSString pathWithComponents:[NSArray arrayWithObjects:
      [libraryFolders objectAtIndex:0], @"Preferences", bundleName, nil]];
    prefsFile = [prefsFile stringByAppendingPathExtension:@"plist"];

    // Get a path to the same preference file in the given sandbox container.
    NSString *containerPrefsFile = [NSString pathWithComponents:[NSArray arrayWithObjects:
      [libraryFolders objectAtIndex:0], @"Containers", containerID,
      @"Data", @"Library", @"Preferences", bundleName, nil]];
    containerPrefsFile = [containerPrefsFile stringByAppendingPathExtension:@"plist"];

    NSFileManager *fm = [NSFileManager defaultManager];
    if ([fm fileExistsAtPath:containerPrefsFile]) {
      NSDate *prefsModDate = [[fm attributesOfItemAtPath:prefsFile error:nil]
                                 objectForKey:NSFileModificationDate];
      NSDate *containerModDate = [[fm attributesOfItemAtPath:containerPrefsFile error:nil]
                                     objectForKey:NSFileModificationDate];
      if ((prefsModDate == nil) || ([prefsModDate compare:containerModDate] == NSOrderedAscending)) {
        // Copy the file.
        [fm copyItemAtPath:containerPrefsFile toPath:prefsFile error:nil];

        // Reset so the next call to [NSUserDefaults standardUserDefaults]
        // recreates an object pointing at the new prefs file.
        [NSUserDefaults resetStandardUserDefaults];

        NSLog(@"Found newer preferences in %@ - importing", containerPrefsFile);
        didMigrate = YES;
      }
    }
  }

  return didMigrate;
}

I call the above preferences migration code directly from main(), so it executes before any part of the main app might hit NSUserDefaults. It has worked pretty well thus far.

Moving back a generation

This week, I’m doing something rare and actually stepping back a generation of hardware. I usually try to keep the latest and greatest around here, within reason, but with this I just can’t help it.

Back in 2008 I purchased a Cisco ASA 5505 firewall/router. It has worked perfectly since then, and I probably only use 10% of its amazing feature set. I have it configured to forward a bunch of ports (using NAT/PAT), provide VPN service for my devices while I’m out of the office, and do basic packet inspection to avoid DoS attacks and other issues. The router has never once crashed on me and has stayed online for hundreds of days at a time without any issue.

So why am I replacing it? Well, it turns out that Cisco’s licensing absolutely cripples the 5505. I have a 10 user license, which I thought would be plenty when I bought it. Of course, this was before all the extra mobile devices, game devices, webcams, and printers were added to the network. I quickly passed this 10 device limit and am well on my way to three times that. Everything has WiFi built-in these days, and 10 devices just doesn’t cut it anymore.

I looked into what it would cost to upgrade the ASA to a 50 user license or an unlimited license. The 50 user license is around $250, and the unlimited license is a $350 upgrade. That’s more than I spent on the router hardware itself.

For the past couple years, I’ve gotten around the limitation by segmenting the network. I put my main systems (development Mac, the file server, etc) on the primary network connected to the ASA, and have connected everything else to a second subnet that uses an Airport Extreme as a gateway. So the ASA only sees a few devices on the primary network, and everything else hides behind the Airport. This works pretty well, but the Airport Extreme bottlenecks communication between the two subnets, and devices on the primary network can’t connect to devices on the secondary network.

I’m tired of it. So this week when I saw someone on Craigslist selling a PIX 515e firewall, I jumped at the chance to have an unrestricted network. Even though the PIX is a few years older, it’s a higher-end model, so it can handle 50% more bandwidth than the ASA (up to 190 Mbps). If I ever wanted to segment the network again, the PIX supports up to 25 VLANs. And the previous owner added a memory upgrade, so it runs the same OS version as my newer ASA. There really isn’t a drawback I can see.

Of course I am still keeping the Airport Extreme on the network. I definitely don’t want to give up wireless. But now the Airport can act as a bridge and allow two-way traffic between wired and wireless clients. I also brought all wired devices from the secondary network back onto the primary, where they can talk to each other directly using a 24 port gigabit smart switch. It is a much faster and cleaner setup.

Here’s a shot of the home network rack since the upgrade.

Home Office Rack

Yet another iPad 3 Review

I thought I would throw together an iPad 3 review, now that I’ve had about a week to use it. There are a bunch of features I won’t mention, either because they’re not key features to me, or because they’re not available in my area (4G LTE).

Retina Display

There are no words to fully describe it. The sharpest computer display I’ve ever seen. Photos look like glossy prints. Text looks better than a magazine. It’s not just prettier though, it’s easier on the eyes as well. Try reading all day on a standard resolution iPad, and then again on the iPad 3, and you’ll be a lot less fatigued after reading on the iPad 3.

The only drawback of the Retina Display is that I’ve found you have to be careful when picking home screen backgrounds (yep, such a 21st century “issue”). Pick a photo with too much detail in it, and it will be so sharp as to actually distract from finding the app you’re looking for. Choose your background images carefully.

Memory

This is the second biggest feature to me. This iPad has 1024MB of RAM instead of 512MB on the iPad 2 (or 256MB on the iPad 1). This means more apps stay open in the background, and switching between recently used apps happens a lot faster. I’ll often switch to an app I haven’t used for several hours, expecting it to need to relaunch, only to find that it’s still in memory. It’s a big difference.

From a development standpoint, this iPad would have been crippled if Apple stuck with 512MB of RAM. With the new display, just to show an image full screen takes 12.5MB of RAM. Even if you wanted a simple app to show a 10 frame animation, that’s 125MB of memory gone. With Seasonality Go, I’m seeing about 4-5 times the memory usage when using a full screen radar map with animation. With the iPad 3 though, it’s not a problem, there’s still plenty of memory to spare.
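The arithmetic behind those figures is straightforward (my own illustration, assuming 32-bit RGBA pixels at the iPad 3’s 2048×1536 resolution):

```python
WIDTH, HEIGHT = 2048, 1536  # iPad 3 Retina resolution
BYTES_PER_PIXEL = 4         # assuming 32-bit RGBA

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL
print(frame_bytes / 10**6)       # 12.582912 -> the ~12.5MB per frame above
print(10 * frame_bytes / 10**6)  # 125.82912 -> ~125MB for a 10 frame animation
```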

“This Runs Warm”

The press (most notably Consumer Reports) has made a big stink about how the new iPad runs warmer than the old one. Does it? Yep. Does it matter? Nope. It feels warm only when playing games or doing something equally intense. During typical usage, it stays cool. Mine has never gotten what I would call “hot” or even uncomfortable. What do people expect when you pack a faster processor, bigger battery, LTE networking, and a screen that requires twice as many backlights all into a device the same size as the last generation? It’s going to run warmer, that’s all there is to it.

There is one drawback to this, and it’s more of a psychological issue. When using an iPad, if it gets warm, you start to realize the tablet is really like any other computer with hot components and things inside that are “working hard”. My iPad 1 always stayed cool to the touch while I was using it, and I think that made it feel like less of a computer and more of a pane of glass with some information I could interact with. With an iPad that gets warm, some of the magic is gone.

Ergonomics

Coming from the iPad 1, this is the biggest drawback. Yes, the iPad 3 is thinner, and that is nice. But Apple also decided to taper the edges (this happened first with the iPad 2), resulting in there being no real “sides” to the device. This makes it difficult to push any of the buttons to adjust the volume or turn the device on/off. It’s easy to hit a button on a flat side, but difficult when you have to come from the back at a 45° angle. A similar change happened on the iPod touch a few generations back, and I can’t wait until Apple decides that having sides is cool again.

Camera

This iPad has a new camera on the back. From what I’ve read, it’s the 5 megapixel image sensor from the iPhone 4, mated to the new lens elements in the iPhone 4S. The resulting images are impressive, but they’re not going to take the place of a digital camera (unless you only take pictures outside on a sunny day, I suppose). Still, I’m glad I have one more 1080p video camera close by for when my 1-year-old daughter does something new.

Summary

Overall, this iPad is the best one ever, but you already knew that. Should you upgrade? If you have an iPad 1, then by all means buy a new iPad 3. You’ll love it, and it will feel so much more refreshing with the speed bump and the amazing display.

The decision isn’t as easy if you have an iPad 2, of course. If I had an iPad 2, I would wait and upgrade once the iPad 4 comes out next year. But that’s just me.

Apple’s Mythical iTV

I don’t know if Apple is planning to release an “iTV” at some point, but this is the wrong approach for competitors to take. According to Chris Moseley, AV product manager at Samsung:

We’ve not seen what they’ve done but what we can say is that they don’t have 10,000 people in R&D in the vision category.

They don’t have the best scaling engine in the world and they don’t have world renowned picture quality that has been awarded more than anyone else.

TVs are ultimately about picture quality. Ultimately. How smart they are…great, but let’s face it that’s a secondary consideration. The ultimate is about picture quality and there is no way that anyone, new or old, can come along this year or next year and beat us on picture quality.

He makes a good point: TVs are about picture quality. The thing is, I’ve never seen another brand of computer monitor (except perhaps high-end NEC displays, which are on a different pricing level) hold a candle to the picture quality you get from any of Apple’s modern LCDs. Picture quality just isn’t going to be an issue if Apple ever ships an iTV.

The quote above reminded me a lot of a statement Palm’s then-CEO Ed Colligan made back in 2006 about the rumored iPhone:

“We’ve learned and struggled for a few years here figuring out how to make a decent phone. PC guys are not going to just figure this out. They’re not going to just walk in.”

We all know how that turned out…

Apps for Water

It’s important to me that Gaucho Software, as a company, gives back to society as a whole. This manifests itself in a variety of ways, from providing the app XRG for free (and open-sourcing the code), to donating money to different charitable organizations. With the holiday season upon us, I wanted to take this opportunity to give again, but in a bigger way than ever before. So today I’m announcing a new campaign called Apps for Water.

Apps for Water gives app developers a chance to donate one day of their sales to a very worthy cause: providing clean water for developing nations. It seems like a simple thing: turn on a faucet and clean water pours out. Unfortunately, there are almost 1 billion people in the world who don’t have a safe, clean water source. This isn’t just a minor inconvenience; the lack of safe water is one of the leading causes of death worldwide. Over 2,000,000 people die each year from diseases stemming from the lack of clean water.

This is where organizations like charity: water come in. The goal of these organizations is to provide safe water for communities in developing nations. Believe it or not, it only costs about $20 to give a new person access to clean water. And because of a collection of private donors that pay for the day-to-day operations of charity: water, every dime that is donated by 3rd parties to their organization goes directly toward water projects.

So here’s the deal. Gaucho Software will be donating 100% of the proceeds from app sales on December 20th, 2011 to charity: water. Help fund a water project, get an app for free. Furthermore, as an additional incentive to donate, Seasonality Core and Seasonality Go will be on sale that day for 20% off.

A call to app developers…

But the hope is that my offer will be just the tip of the iceberg. I would like to formally invite all other developers of Mac and iOS apps to participate in this campaign. We’re setting up a new website[1] at AppsForWater.com to list all the apps that will be giving discounts and donating their proceeds that day to charity: water. Then we’ll get the word out, and tell everyone about how they can give clean water to communities in need.

If you are a developer who is interested in participating, please send me an email or ping me on Twitter. I will do my best to answer any questions and will let you know what materials I need to have your apps listed.

It’s been shown time and time again that the Mac and iOS community can make a big difference in the world. Please help us do it again.

 

[1] I want to send a special thanks to François Joseph de Kermadec and the rest of the team at Trumper, Forsyth & Kline for so generously donating their time to develop the website for this project.

Using IOKit to Detect Graphics Hardware

After Seasonality Core 2 was released a couple of weeks ago, I received email from a few users reporting problems they were experiencing with the app. The common thread in all the problems was having a single graphics card (in this case, it was the nVidia 7300). When the application launched, there would be several graphics artifacts in the map view (which is now written in OpenGL), and even outside the Seasonality Core window. It really sounded like I was trying to use OpenGL to do something that wasn’t compatible with the nVidia 7300.

I’m still in the process of working around the problem, but I wanted to make sure that any work-around would not affect the other 99% of my users who don’t have this graphics card. So I set out to try and find a method of detecting which graphics cards are installed in a user’s Mac. You can use the system_profiler terminal command to do this:

system_profiler SPDisplaysDataType

But running an external process from within the app is slow, and it can be difficult to parse the data reliably. Plus, if the system_profiler command goes away, the application code won’t work. I continued looking…

Eventually, I found that I might be able to get this information from IOKit. If you run the command ioreg -l, you’ll get a lengthy tree of hardware present in your Mac. I’ve used IOKit in my code before, so I figured I would try to do that again. Here is the solution I came up with:

// Check the PCI devices for video cards. 
CFMutableDictionaryRef match_dictionary = IOServiceMatching("IOPCIDevice");

// Create a iterator to go through the found devices.
io_iterator_t entry_iterator;
if (IOServiceGetMatchingServices(kIOMasterPortDefault, 
                                 match_dictionary, 
                                 &entry_iterator) == kIOReturnSuccess) 
{
  // Actually iterate through the found devices.
  io_registry_entry_t serviceObject;
  while ((serviceObject = IOIteratorNext(entry_iterator))) {
    // Put this services object into a dictionary object.
    CFMutableDictionaryRef serviceDictionary;
    if (IORegistryEntryCreateCFProperties(serviceObject, 
                                          &serviceDictionary, 
                                          kCFAllocatorDefault, 
                                          kNilOptions) != kIOReturnSuccess) 
    {
      // Failed to create a service dictionary, release and go on.
      IOObjectRelease(serviceObject);
      continue;
    }
    // If this is a GPU listing, it will have a "model" key
    // that points to a CFDataRef.
    const void *model = CFDictionaryGetValue(serviceDictionary, CFSTR("model"));
    if (model != nil) {
      if (CFGetTypeID(model) == CFDataGetTypeID()) {
        // Create a string from the CFDataRef.
        NSString *s = [[NSString alloc] initWithData:(NSData *)model 
                                            encoding:NSASCIIStringEncoding];
        NSLog(@"Found GPU: %@", s);
        [s release];
      }
    }
    // Release the dictionary created by IORegistryEntryCreateCFProperties.
    CFRelease(serviceDictionary);

    // Release the serviceObject returned by IOIteratorNext.
    IOObjectRelease(serviceObject);
  }

  // Release the entry_iterator created by IOServiceGetMatchingServices.
  IOObjectRelease(entry_iterator);
}

© 2022 *Coder Blog
