Ben Scheirman

These fickle bits...


Fixing GitHub SSL Issue on 10.9.2

By now you’ve most likely heard about the egregious SSL flaw that has existed in OS X and iOS for a while now.

Yesterday, Apple (finally) released 10.9.2, which addresses the flaw along with a number of other fixes. Upon upgrading, I was more than slightly frightened to see this error when trying to open github.com:

Using Rbenv in Cron Jobs

When using rbenv on your server, you need to make sure that any gem command is executed with rbenv initialized. When you install rbenv locally or on the server, you typically have something like this added to your .bashrc:

if which rbenv > /dev/null; then eval "$(rbenv init -)"; fi
...
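The catch is that cron runs jobs with a minimal, non-interactive environment, so .bashrc is never sourced and rbenv’s shims aren’t on the PATH. One workaround is to initialize rbenv inline in the crontab entry itself. The entry below is an illustrative sketch, not taken from the post — the rbenv install path and the rake task are assumptions; adjust them to match your server:

```shell
# Illustrative crontab entry -- the ~/.rbenv path and the task are placeholders.
# Initialize rbenv explicitly before running any gem-backed command.
0 * * * * /bin/bash -c 'export PATH="$HOME/.rbenv/bin:$PATH"; eval "$(rbenv init -)"; cd /var/www/myapp && bundle exec rake some:task'
```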

Synchronizing Dotfiles

I’d put this off for far too long, but I finally released my dotfiles on GitHub. Part of the reason it took me a while is I already had a syncing solution in Dropbox. I’m still using Dropbox to synchronize between my own machines, but I now have them published on GitHub as well. In addition, I wrote a handy script (with some inspiration from Steve Harman’s dotfile setup) that symlinks the files into the right spot on the target machine:

Creating a Fusion Drive

Fusion Drive

I have a Late 2009 Core i7 27” iMac that was starting to feel old. Many seemingly simple tasks would cause the OS to beachball, which generally made me not want to use the computer.

This slowness occurred despite the machine having a still respectable 2.8 GHz Core i7 processor. In fact, running some benchmarks with Geekbench led me to believe the problem didn’t lie with my CPU; it was my disk.

Unfortunately Geekbench doesn’t have any disk benchmarks, so I used the relatively old Xbench as a baseline. Compared with my Retina MacBook Pro with a 512 GB SSD, this drive was painfully slow.

I’d read about SSD upgrades in the 27” iMac, but I was faced with the ultimate trade-off: the raw speed of an SSD versus the sheer capacity of a traditional hard drive.
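A Fusion Drive sidesteps that trade-off by merging both disks into one Core Storage logical volume. As a hedged sketch of how that is done with Apple’s tooling — these commands are destructive, and the disk identifiers below are placeholders, so verify yours with `diskutil list` first:

```shell
# WARNING: illustrative only -- this erases both disks. disk0/disk1 are assumptions.
diskutil list                                    # find the SSD and HDD identifiers
diskutil coreStorage create Fusion disk0 disk1   # merge them into one logical volume group
# Then create the volume, substituting the lvgUUID printed by the previous command:
diskutil coreStorage createVolume <lvgUUID> jhfs+ "Macintosh HD" 100%
```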

75 Essential Tools for iOS Developers

If you were to go to a master woodworker’s shop, you’d invariably find a plethora of tools that he or she uses to accomplish various tasks.

In software it is the same. You can measure a software developer by how they use their tools. Experienced software developers master their tools. It is important to learn your current tools deeply, and be aware of alternatives to fill in gaps where your current ones fall short.

With that in mind, I present to you a gigantic list of tools. Some of these I use daily; others I see potential in. If you have more tools you’d like to see here, please add a comment.

Speaking at Cocoa Conf PDX

I have the pleasure of speaking at Cocoa Conf PDX on August 15-16th. Cocoa Conf is always a great event, and I especially love traveling to my home state of Oregon. This time around I’ll be giving two talks:

  • The iOS Developer’s Toolbelt
  • Jenkins - Your personal butler for iOS automation

I hope to see you there!

In Search of a Fast External Hard Drive

Ever since I upgraded to a Retina MacBook Pro, I knew I’d have to come up with a new strategy for storing data. Even after upgrading to the 512 GB SSD, I’m still running out of space. With hundreds of gigabytes of pictures, music, videos, and games, even a 512 GB SSD fills up fast. And now that NSScreencast is nearing a year old, I have more data than I can store on a single drive. Another nuisance was transferring these videos over to my iMac for editing:
a typical 20-minute screencast of mine will eat up nearly 8 gigabytes before encoding, and transferring a file that size over Wi-Fi is painfully slow.

On my previous MacBook Pro I opted to remove the SuperDrive and install a second 7200 RPM drive for larger storage. This worked well, but the Retina MacBook Pro has no such option, so I went on the lookout for an external drive to store NSScreencast videos.

Houston Code Camp - Call for Speakers

Houston Code Camp 2012 is happening on August 25th at the Microsoft offices in Houston, TX.

This is going to be a great event! Last year we had sessions on Ruby, C#, JavaScript, iOS, and lots more. It’s definitely a polyglot event where there is something for everyone.

We’re looking for speakers, so if you’re interested in speaking, please submit a topic. I hope you’ll consider being a part of it.

Registration is not yet open, but keep a look out on twitter or this blog for more details.

Serving Assets From S3 on Heroku

Recently I changed NSScreencast to use a CDN to serve up assets from a different, faster server.

Why use a CDN?

Using a CDN has numerous benefits. First and foremost, it alleviates a bunch of secondary requests that would normally hit your web server. Loading the home page of NSScreencast requests more than a dozen images, stylesheets, and JavaScript files. When deploying to Heroku this can be especially problematic, as each asset request occupies one of your dynos for a short time. In the spirit of maximizing your free dyno on Heroku, not sending these requests to your app is definitely a big win.

In addition, browsers limit the number of parallel connections (historically just two) that they will open to a given domain. By using a CDN, these assets are served from a different domain than your application’s, so the browser can fetch more of them in parallel.

It’s also a common practice to use DNS aliases to “shard” assets across several hostnames, maximizing this parallelization even further.
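Rails supports this directly: an asset host containing %d is expanded to one of four hosts (assets0 through assets3), chosen deterministically per asset path so each asset stays cacheable under a single hostname. A minimal sketch of that idea — the example.com domain is a placeholder, and the CRC32 hash here stands in for whatever Rails uses internally:

```ruby
require "zlib"

# Domain sharding sketch: map each asset path to one of four hostnames.
# The same path always yields the same host (cache-friendly), while
# different paths spread across hosts (more parallel downloads).
# "assets%d.example.com" is a placeholder domain.
def asset_host_for(source)
  shard = Zlib.crc32(source) % 4
  format("assets%d.example.com", shard)
end
```

For this to work, every shard hostname must serve identical content — typically they are all CNAMEs pointing at the same CDN.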

Using the asset sync gem

Following the instructions on Heroku’s Devcenter article I decided to use the asset_sync gem. This gem will upload your compiled assets to your preferred CDN (any file storage server that fog supports). In my case, I wanted to use S3.

The first step is adding this gem to your Gemfile:

group :assets do
  # other asset gems
  gem 'asset_sync'
end

It’s important to put this in your asset group, as your running app doesn’t need to load this into memory.

Then you need to configure the gem. I found Heroku’s instructions to be lacking here, as I had to dig into the asset_sync GitHub page to make this work.

Add a file called config/initializers/asset_sync.rb to your app:

# Since this gem is only loaded with the assets group, we have to check to 
# see if it's defined before configuring it.
if defined?(AssetSync)
  AssetSync.configure do |config|
    config.fog_provider = 'AWS'
    config.aws_access_key_id = ENV['AWS_ACCESS_KEY_ID']
    config.aws_secret_access_key = ENV['AWS_SECRET_ACCESS_KEY']
    config.fog_directory = ENV['FOG_DIRECTORY']

    # Fail silently.  Useful for environments such as Heroku
    config.fail_silently = true
  end
end

That last config line is important. When you deploy to Heroku, your app’s assets will get precompiled. But because Heroku doesn’t initialize your app on precompile, none of your settings will be available. Instead we’ll have to run the precompile again, manually, to get AssetSync to kick in.

Setting up the configuration with Heroku San

Since I like to have multiple environments, I use heroku_san to manage them, including the environment variables.

Inside of config/heroku.yml, set up the following for each environment:

FOG_PROVIDER: "AWS"
FOG_DIRECTORY: "nsscreencast-assets"
AWS_ACCESS_KEY_ID: "<your access key>"
AWS_SECRET_ACCESS_KEY: "..."

Configuring Your Rails app to use S3 as an Asset Host

In your config/production.rb (and staging.rb if you have one), make sure to add the following line to allow Rails to generate the appropriate links for your assets:

  config.action_controller.asset_host = Proc.new do |source, request|
    scheme = request.ssl? ? "https" : "http"
    "#{scheme}://#{ENV['FOG_DIRECTORY']}.s3.amazonaws.com"
  end

This will allow your app to serve up the URLs using SSL if the request is coming via SSL. Doing this can avoid warnings in the browser that your app contains secure and unsecure content.
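To see what this proc actually produces, here is a standalone sketch of the same logic with a stubbed request object. FakeRequest and the hard-coded bucket name are illustrative stand-ins, not part of the real app:

```ruby
# Stand-in for ActionDispatch::Request -- just enough to exercise the proc.
FakeRequest = Struct.new(:ssl) do
  def ssl?
    ssl
  end
end

# Placeholder bucket name; in the real app this comes from heroku.yml.
ENV["FOG_DIRECTORY"] = "nsscreencast-assets"

asset_host = Proc.new do |source, request|
  scheme = request.ssl? ? "https" : "http"
  "#{scheme}://#{ENV['FOG_DIRECTORY']}.s3.amazonaws.com"
end

asset_host.call("/assets/application.css", FakeRequest.new(true))
# => "https://nsscreencast-assets.s3.amazonaws.com"
```

Note that the scheme tracks the incoming request, which is exactly what prevents mixed-content warnings on SSL pages.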

Testing it all out

If all of this is configured correctly, you can test it out by doing a push…

git push heroku master

You’ll see the asset precompile going on in the logs, and likely an error related to AssetSync. This is fine (and in fact, this tripped me up at first). Once the deploy has completed, you’ll have to run this command to upload your assets:

heroku run rake assets:precompile --app <yourapp>

Doing this, you should see something like the following output:

Precompiling assets...
Uploading: application.css
Uploading: application.css.gz
Uploading: image1.png
Uploading: image2.png
...

Set up Heroku San to do this on every deploy

I’d likely forget to run this command every once in a while, so I set up Heroku San to run this command after every deploy.

To do this, add a new rake task in your app (lib/tasks/deploy.rake):

task :after_deploy do
  HerokuSan.project.each_app do |stage|
    puts "---> Precompiling assets & uploading to the CDN"
    system("heroku run rake assets:precompile --app #{stage.app}")
  end
end

Now when you run your deploy via rake production deploy this will happen automatically.

So what’s the net result?

Doing this alleviated nearly 30 secondary requests to my application for each page load. That alone is pretty huge. Also, S3 is much faster at serving these assets than nginx is (at least via a Heroku app on 1 dyno).

I tested this before and after by clearing the cache and doing a fresh page load. Using the Chrome Inspector, I looked at the time to load the page and all assets. Here are my findings:

  • Before (serving assets with no CDN): 3.27 seconds
  • After (using S3 as a CDN): 1.07 seconds

That’s a huge gain for a minor change in your application & deployment process.