Meet the SLAs

Building Rails for scale

Dinshaw Gobhai | dgobhai@constantcontact.com

@tallfriend

github.com/dinshaw/meet-the-slas

dev-setup.md

            
            1  ## Install prerequisites
            2     brew install mysql

            ...   ...

            1002
            1003  ## Run the app
            1004   rails s
            
          

The golden path

Building for scale?

Don't do it.

Don't over architect.

Don't prematurely engineer.

Don't solve problems you don't have yet.

Measure

Cell Architecture

iRule to direct requests by :account_id

8 TorqueBox instances (96 app servers)

1 Redis Server

1 Memcached Server

1 MySQL master, 1 slave

API GET: 500 contacts JSON

SLA: < 500ms

Actual: ~ 44s

---

(3M / 500) * 44s = 3 days

(3B / 500) * 44s = 8.4 years

Profiling & Benchmarks

ruby-prof / DTrace

Benchmark

          
  def count
    result = nil
    time = Benchmark.measure('Count people') do
      result = ContactsSelector.count_people(params)
    end
    Rails.logger.info time
    result
  end
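`Benchmark` ships with Ruby's standard library; `measure` returns a `Benchmark::Tms` whose `real` field is wall-clock seconds, and the block's own return value is discarded. A minimal standalone sketch:

```ruby
require 'benchmark'

# Capture the block's result separately, since measure returns a Tms
result = nil
tms = Benchmark.measure('count people') do
  result = (1..100_000).count(&:even?)
end

puts format('%s took %.4fs', tms.label, tms.real)
```

Logging the `Tms` (or just `tms.real`) per request is often enough to find which call dominates the SLA budget.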
          
          

Rails & N+1

preload, eager_load, includes, and joins

.preload(:association)

Separate queries for associated tables.

.eager_load(:association)

One query with all associations 'LEFT OUTER' joined.

.includes(:association)

Picks one of the above: eager_load when the query references the association, otherwise preload.

.joins(:association)

One query with all associations 'INNER' joined.
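To make the difference concrete, here is a sketch of the SQL each strategy issues, assuming a hypothetical `Post` model with `has_many :comments` (the SQL comments are approximate, not exact Rails output):

```ruby
# Hypothetical models: Post has_many :comments

Post.preload(:comments).to_a
# SELECT * FROM posts
# SELECT * FROM comments WHERE post_id IN (...)

Post.eager_load(:comments).to_a
# SELECT posts.*, comments.* FROM posts
#   LEFT OUTER JOIN comments ON comments.post_id = posts.id

Post.includes(:comments).where(comments: { active: true }).to_a
# condition references the association, so Rails uses the
# eager_load (LEFT OUTER JOIN) strategy here

Post.joins(:comments).to_a
# SELECT posts.* FROM posts
#   INNER JOIN comments ON comments.post_id = posts.id
# note: filters by the join but does NOT load comments into memory
```

joins is the cheapest when you only need the parent rows; preload avoids one giant join result set when the associations are wide.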

Advanced aRel

https://www.youtube.com/watch?v=ShPAxNcLm3o

            
  .where(
    "author.name = ? and posts.active = ?",
    "Jane", true
  )

  Post
    .joins(:comments)
    .joins(Comment.joins(:author).join_sources)
    .where(
      Author[:name].eq('Jane')
      .and(Post[:active].eq(true))
    )
            
          

API GET: 500 contacts JSON

SLA: < 500ms

Actual: ~ 26s

Composite Primary Keys

Rails anti-pattern?

A multiple-column index is faster than multiple single-column indexes

            
  class Contact < ActiveRecord::Base
    self.primary_keys = :account_id, :contact_id
    ...
  end
            
          

github.com/composite_primary_keys

percona.com/multiple-column-index-vs-multiple-indexes
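The Percona comparison linked above boils down to: one composite index on the common access pattern can serve lookups on its leftmost column alone as well as on the full pair. A migration sketch (table and column names are assumptions from this deck):

```ruby
class AddCompositeIndexToContacts < ActiveRecord::Migration
  def change
    # One index on (account_id, contact_id) serves queries filtering on
    # account_id alone AND on the (account_id, contact_id) pair;
    # two single-column indexes cannot be combined as cheaply.
    add_index :contacts, [:account_id, :contact_id], unique: true
  end
end
```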

In-line caching

Remember a previous method lookup directly at the call site.

In-line caching (monomorphic)


            class Foo
              def do_something
                puts 'foo!'
              end
            end

            # first call does a full lookup of #do_something and
            # stores the result at the call site, or 'in-line'
            foo = Foo.new
            foo.do_something

            # second call sees the same receiver class,
            # so the cached lookup is reused without a new lookup
            bar = Foo.new
            bar.do_something
          

In-line caching (polymorphic)


              class Foo
                def do_something
                  puts 'foo!'
                end
              end

              class Bar
                def do_something
                  puts 'bar?'
                end
              end

              # Worst case scenario for Monomorphic
              [Foo.new, Bar.new, Foo.new, Bar.new].each do |obj|
                obj.do_something
              end

              # a polymorphic in-line cache behaves roughly like:
              case obj
              when Foo then puts 'foo!'
              when Bar then puts 'bar?'
              else # full lookup ...
              end
            

Megamorphic!

github.com/charliesome/...rubys-method-cache

            
  # composite_primary_keys/relation.rb
  def add_cpk_support
    class << self
      include CompositePrimaryKeys::ActiveRecord::Batches
      include CompositePrimaryKeys::ActiveRecord::Calculations
      ...

  # patch: do the includes once, at class definition time
  class CompositePrimaryKeys::ActiveRecord::Relation < ActiveRecord::Relation
    include CompositePrimaryKeys::ActiveRecord::Batches
    include CompositePrimaryKeys::ActiveRecord::Calculations
    ...

  class ActiveRecord::Relation
    def self.new(klass, *args)
      klass.composite? ?
        CompositePrimaryKeys::ActiveRecord::Relation.new(klass, *args) :
        super
    end
            
          

Database partitioning

Tables still too big - millions of Contacts per cell

MySQL Hash partitioning by :account_id

mysql.com/products/enterprise/partitioning
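Hash partitioning is plain MySQL DDL, so it can be applied from a Rails migration with `execute`. A sketch (the partition count is an assumption; note MySQL requires every unique key on a partitioned table to include the partitioning column, which the composite primary key above satisfies):

```ruby
class PartitionContactsByAccount < ActiveRecord::Migration
  def up
    # Rows hash on account_id, so all of one account's contacts
    # land in the same, much smaller partition.
    execute <<-SQL
      ALTER TABLE contacts
      PARTITION BY HASH(account_id)
      PARTITIONS 64
    SQL
  end

  def down
    execute 'ALTER TABLE contacts REMOVE PARTITIONING'
  end
end
```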

API GET: 500 contacts JSON

SLA: < 500ms

Actual: ~ 4s

DB: 12ms

Serialization

Skip your ORM

dockyard.com/generating-json-responses-with-postgresql

CAUSE: Too much stuff

string = params[:country]
country = COUNTRY_CODE_MAP.detect do |k, v|
  [k.downcase, v.downcase].include? string.downcase
end
  • Creates a new array INSIDE A LOOP
  • downcase allocates a copy of each string
  • k, v, and string are all downcased on every iteration just to call include?

SOLUTION: Do less work

string = params[:country]
country = COUNTRY_CODE_MAP.detect do |k,v|
  k.casecmp(string) == 0 || v.casecmp(string) == 0
end
  • No array
  • No downcased copy of strings
  • || instead of include?
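The difference shows in plain Ruby: `casecmp` returns 0 on a case-insensitive match without allocating any new strings (`COUNTRY_CODE_MAP` here is a small stand-in for the real map):

```ruby
# Stand-in for the real country map
COUNTRY_CODE_MAP = { 'US' => 'United States', 'DE' => 'Germany' }

def find_country(string)
  COUNTRY_CODE_MAP.detect do |k, v|
    # casecmp compares in place: no downcased copies, no temporary array
    k.casecmp(string) == 0 || v.casecmp(string) == 0
  end
end

find_country('germany')  # => ["DE", "Germany"]
```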

Memoize!

def contact_ids
  @contact_ids ||= params[:ids].split(',')
end
  • First call stores result in an instance variable
  • Subsequent calls return the instance variable
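One caveat: `||=` re-runs the computation whenever the cached value is `nil` or `false`. When those are legitimate results, `defined?` memoizes correctly (plain-Ruby sketch; `expensive_check` is a stand-in):

```ruby
class Lookup
  def calls
    @calls ||= 0
  end

  # @flag ||= expensive_check would re-run the check on every call,
  # because the memoized value is false
  def flag
    return @flag if defined?(@flag)
    @flag = expensive_check
  end

  private

  def expensive_check
    @calls = calls + 1
    false
  end
end

lookup = Lookup.new
lookup.flag
lookup.flag
lookup.calls  # => 1, even though the memoized value is false
```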

CAUSE: Redundancy Across Requests

def subscriber_confirmation
  @subscriber_confirmation ||= expensive_lookup
end
  • Lookup is expensive (external service / long db query / cpu intensive)
  • Value changes rarely under known circumstances

SOLUTION: Cache With Redis

Usage
def subscriber_confirmation
  @subscriber_confirmation ||= Rails.cache.fetch(
    "subscriber_confirmation:#{Current.account}",
    :expires_in => LONG_TIME_IN_SECONDS
  ) { |key| expensive_lookup(key) }
end
  • Looks up value of key
  • Checks that key is not expired
  • If value is missing or expired:
    • Calls block with key
    • Updates cache with result

View/action caching

CAUSE: Exceptions as part of normal application flow

class SomeModel
  def funky_method
    unless complicated_action_succeeds
      raise FunkyError, 'Something Funky Failed'
    end
  end
end

class SomeController
  def some_action
    response = SomeModel.new(params).funky_method
    render status: 200, json: response
  rescue FunkyError => e
    render status: 400, json: {errors: 'Funky Failboat!'}
  end
end
  • Exception generates a backtrace
  • Exception is being raised manually

SOLUTION: Return Message not Exception

class SomeModel
  def funky_method
    if complicated_action_succeeds
      { result: 'All good' }
    else
      { error: 'Yeah, that happens' }
    end
  end
end

class SomeController
  def some_action
    response = SomeModel.new(params).funky_method
    status = response[:error] ? 400 : 200
    render status: status, json: response
  end
end
  • Return value indicates error
  • No exception at controller == No Backtrace

Excessive Logging

Rails.logger.debug "This is expensive #{some_expensive_method}"
  • The message string is built even when debug logging is disabled
big_array.each do |x|
  Rails.logger.debug 'processing row'
  ...
end
  • logger is called n times with the same message

Log carefully

if Rails.logger.debug?
  Rails.logger.debug "This is expensive #{some_expensive_method}"
end
  • Check if you are logging debug first
Rails.logger.debug "processing #{big_array.size} rows"
big_array.each do |x|
  ...
end
  • logger is called once

Too Many AR Instances

def all_contact_ids
  Contact.all.map(&:contact_id)
end
  • Creates a lot of objects
  • Retrieves all columns, only uses contact_id

Use Pluck

def all_contact_ids
  Contact.all.pluck(:contact_id)
end
  • Creates a single Array
  • Retrieves only contact_ids

In Rails 3 you can only pluck one column, but Rails 4 allows multiple!

Destroying Records Is Slow

class Contact
  has_many :addresses, dependent: :destroy

  def self.remove_bad_contacts
    self.destroy_all(bad_contact: true)
  end
end
  • Instantiates each record
  • Calls destroy callbacks
  • Calls destroy on dependents (more instances!)

If you don't have callbacks that need to run, and you can handle the dependents yourself, this is a big waste of memory and CPU.

Use Delete Instead

class Contact
  has_many :addresses, dependent: :destroy

  scope :bad_contacts, -> { where(bad_contact: true) }

  def self.remove_bad_contacts
    Address.delete_all contact_id: self.bad_contacts.pluck(:contact_id)
    self.bad_contacts.delete_all
  end
end
  • Does not instantiate any AR records
  • Does not call destroy callbacks
  • Manually delete dependencies

Be careful not to skip important logic, especially on those dependents!
Consider a method on the dependent model that does this safely.

Reads for Validation

class Contact < ActiveRecord::Base
  validates :first_name, uniqueness: {scope: :last_name}
end
  • Saving a record requires a read and a write

Let the Database do it

class Contact < ActiveRecord::Base
  around_save :check_uniqueness

  def check_uniqueness
    yield
  rescue ActiveRecord::RecordNotUnique
    errors.add :base, "contact is not unique"
    raise ActiveRecord::RecordInvalid.new(self)
  end
end

# db migration
add_index :contacts, [:first_name, :last_name], :unique => true
  • Database index for the unique fields
  • Rescue RecordNotUnique and re-raise it as RecordInvalid
  • RecordInvalid is handled as if valid? had failed

Inserting Multiple Records

  • Many trips to the database
  • Database connection churn
  • Validations and callbacks

Mass Insert

  • Handle Validation Up Front
  • Reconsider callbacks
  • Include a module that can build SQL by introspection
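A minimal sketch of the idea in plain Ruby: collapse N rows into one multi-row INSERT. The quoting here is deliberately naive for illustration; real code must use the connection adapter's quoting (or a gem such as activerecord-import) to avoid SQL injection:

```ruby
# Build one multi-row INSERT instead of N single-row ones.
# NOTE: naive quoting for illustration only.
def mass_insert_sql(table, columns, rows)
  values = rows.map do |row|
    quoted = row.map { |v| v.is_a?(String) ? "'#{v.gsub("'", "''")}'" : v.to_s }
    "(#{quoted.join(', ')})"
  end
  "INSERT INTO #{table} (#{columns.join(', ')}) VALUES #{values.join(', ')}"
end

mass_insert_sql(
  'contacts',
  %w[account_id email],
  [[1, 'a@example.com'], [1, 'b@example.com']]
)
# => "INSERT INTO contacts (account_id, email) VALUES (1, 'a@example.com'), (1, 'b@example.com')"
```

One round trip to the database, one connection checkout, no per-row AR instances.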

MEASURE FIRST

MEASURE OFTEN

Meet the SLAs

Dinshaw Gobhai | dgobhai@constantcontact.com

@tallfriend

github.com/dinshaw/meet-the-slas

Alex 'the beast' Berry

Andre 'the log grepper' Zelenkovas

Tom 'the meeting man' Beauvais