A 1588x Render Speed Increase With Solr Caching on Rails to Solve All Your Performance Issues

23 August 2016. 6 minutes to read

Powerful Rails Report and View Generation

Solr Saves Your Bacon. Again.

I’ve recently covered setting up and storing calculated functions and attributes with Sunspot and Solr in your Rails application. Today, I’m continuing on that with an insanely fast way to keep your reports up to date. Rolling my own partial caching on Rails with Sunspot and Solr was pretty damn fun to build. The best part? It only takes a few lines of Ruby to give expensive, large reports an unbelievable performance boost. Quick, cheap, easy, and blazingly fast performance sound too good to be true? I thought so before today, too.

The Issue

As described in my previous posts, a client had a large, slow report to generate. I used Solr to cache calculated values (and model attributes as well, to save database hits), which greatly increased the speed. It was still too slow for my personal standards. I was able to cut database and calculation time by an order of magnitude, but generating each partial still took time when rendering a large collection. Total page generation time took close to a full minute (that’s generation, not rendering; rendering that much data required a different fix I’ll go into later). That was actually quicker than the Excel sheet this dashboard replaced would take to load, but it’s still incredibly slow from a usability perspective.

Some Numbers, and the Setup

Given a set of contracts, generating each one takes ~14 ms (this includes overhead and time between partial generation). I got this number from the following:

c = Contract.active.last
time = Benchmark.measure do
  1000.times { c.generate_partial }
end

#<Benchmark::Tms:0x007fbcc9880eb8 @label="", @real=14.308556433999911, @cstime=0.0, @cutime=0.0, @stime=4.670000000000001, @utime=8.759999999999998, @total=13.43>

Interestingly, while the partials were being generated, the console reported almost every render completing in about 4ms. Even though the actual render is only 4ms per object, the overhead to set each one up was almost 3x the render time itself.
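To put numbers on that overhead (taking the console’s rounded 4ms figure at face value), here’s the quick back-of-the-envelope arithmetic:

```ruby
renders        = 1000
real_seconds   = 14.308556                         # Benchmark "real" time from above
per_partial_ms = real_seconds / renders * 1000.0   # => ~14.3 ms per partial, all-in
render_ms      = 4.0                               # per-partial render time from the console
overhead_ms    = per_partial_ms - render_ms        # => ~10.3 ms of setup per partial
overhead_ms / render_ms                            # => ~2.6, i.e. "almost 3x" the render time
```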

You may have noticed c.generate_partial: I added an instance method to Contract to handle partial generation for this benchmark and for the later solution, using the render_anywhere gem. Rails 5 ships similar rendering-outside-of-controllers functionality out of the box; for now, you use the gem like so:

require 'render_anywhere'

class Contract < ActiveRecord::Base
  include RenderAnywhere

  def generate_partial
    render self
  end
end

Now, if only I could avoid redundantly generating the partials each time the report was viewed, while still being sure they’re correct. Sounds like a great time for a cache! Oh yeah; these partials also need to be prerendered so that the first time you load the report it’s snappy…

The Solution

Cue the lightbulb. At the time, it seemed like a crazy thought: why not add a stored attribute to the Solr document for each contract? This checks all the boxes for our requirements:

  • Partial is up to date whenever a contract is updated
  • Cache is ready to go even if the contracts report hasn’t been viewed yet
  • It’s fast

Given what I’ve already shown for the contract class, just add the following to the searchable block, and it’s ready to go:

searchable do
  string :contract_report_partial, stored: true do
    generate_partial
  end
end

The above ensures that the partial is not only cached in Solr, but is up to date with all the calculations and values it needs. Now, how does this look on the view page? Given a Solr @search from the controller, within the report view (Haml) we have:

- @search.hits.each do |hit|
  = raw hit.stored(:contract_report_partial)

Note: This isn’t what’s actually being used on production. There’s something much quicker I’ll showcase in a future post. This is for the sake of brevity.
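Why this stays correct: Sunspot reindexes a document whenever the model is saved, so the stored field behaves like a write-through cache — every write re-renders the partial, and reads never touch the renderer. A minimal plain-Ruby sketch of that idea (FakeIndex and render_row are hypothetical stand-ins for illustration, not Sunspot’s API):

```ruby
# Toy write-through cache mimicking a stored Solr field.
class FakeIndex
  def initialize
    @docs = {}
  end

  def index(id, rendered)   # runs on every save/reindex
    @docs[id] = rendered
  end

  def stored(id)            # cheap read; never touches the renderer
    @docs[id]
  end
end

render_row = ->(name, total) { "<tr><td>#{name}</td><td>#{total}</td></tr>" }
index = FakeIndex.new

# "Saving" a contract re-renders its row and writes it through to the index.
index.index(1, render_row.call("ACME", 100))

# An update triggers a reindex, so the stored partial can never go stale.
index.index(1, render_row.call("ACME", 250))
index.stored(1)  # => "<tr><td>ACME</td><td>250</td></tr>"
```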

Performance

After reindexing the Solr documents for our contracts, a search that pulls the first 1000 contracts and retrieves all the rendered partials gives us the following:

search = Contract.solr_search do
  paginate(per_page: 1000)
end

time = Benchmark.measure do
  Admin::Reports::Contracts.new.contract_data(search) # This iterates over each search hit and returns an array composed of all the generated partials
end

#<Benchmark::Tms:0x007fbcc33f0c00 @label="", @real=0.009134497959166765, @cstime=0.0, @cutime=0.0, @stime=0.0, @utime=0.00999999999999801, @total=0.00999999999999801>

Yes, that’s only 9ms to retrieve 1000 partials that previously took 14.3 seconds to generate. And these results are always guaranteed correct. That increased the speed of getting the report data by a factor of… 1588x.
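For the curious, the 1588x figure falls straight out of the two (rounded) benchmark times:

```ruby
before_s = 14.3    # seconds to generate 1000 partials (first benchmark, rounded)
after_s  = 0.009   # seconds to fetch 1000 stored partials from Solr
(before_s / after_s).floor  # => 1588
```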

Memory Usage

For anyone wondering: building a table row (<tr><td>attrs...</td>...</tr>) with 36 attributes for each contract only took up about 730 bytes per Solr document. That added up to ~4.7MB of Solr memory to index the data for our 6,500 contracts. Barely any memory, and a crazy performance boost. Solr delivered big on this one.
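The memory math checks out:

```ruby
bytes_per_doc = 730
contracts     = 6_500
total_bytes   = bytes_per_doc * contracts   # => 4_745_000
total_bytes / 1_000_000.0                   # => 4.745, i.e. ~4.7 MB
```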

Final Thoughts

Yes, Rails can do in-memory fragment caching, and on Heroku this is easy to hook up with memcached. You can also multi-fetch partials to speed things up even more. I’ll test those in the future, but here I was able to use the add-ons already in place for this application and, with the addition of 8 lines of Ruby, take the second most expensive operation in displaying this report down to essentially zero. The most expensive operation time-wise? Having the browser render the DOM. The fix for that is a future post.

This was entirely a way to do something just because I could. The fact that it was quick, reliable, cheap, and simple was the icing on the cake. Enjoy your development work and be creative.

Elasticsearch (Update: Feb 04, 2017)

Since I wrote this article, I’ve spent a lot of time using Elasticsearch to power efficient Rails searching and reporting. Solr was my first dive into Lucene-powered search, but Elasticsearch is what I’d currently recommend for anyone looking into this. Need a hand or have questions about Elasticsearch? Contact me to see what sort of crazy performance gains I can introduce to your application.


If you enjoy having free time and the peace of mind that a professional is on your side, then you’d love to have me work on your project.

Contact me or view a list of available services to see how I’ll make your life better and easier, and bring satisfaction back into running your business.