The performance of the tools, in terms of both time spent and memory usage, is an ever-present focus when hundreds of thousands of keywords are added each day to a system that already makes tens of millions of bids. I identified several areas where performance improvements were needed, including:
- reduced the memory usage of the bidding job by working around the need to load data upfront and ensuring that data could be garbage collected once it was no longer needed (the streaming idea is sketched after this list)
- rewrote the job that keeps our data consistent with our partners' data so that it made far fewer database calls; after the rewrite, a run that had previously taken weeks finished in under an hour (see the batching sketch below)
- refactored the main daily import job to be more reliable
- parallelized numerous API jobs (see the thread-pool sketch below)
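
To illustrate the first item: the memory savings came from not holding the entire dataset in memory at once. The snippet below is a minimal sketch of that streaming idea, not the actual bidding job; the table, column, and function names (`keyword_stats`, `compute_bid`, `submit_bid`) are hypothetical.

```python
# Sketch: stream rows in batches instead of loading them all upfront.
# Every name here is a stand-in, not the real schema or job.

def fetch_keyword_rows(cursor, batch_size=10_000):
    """Yield rows a batch at a time so the full result set never sits in memory."""
    cursor.execute("SELECT keyword_id, clicks, cost FROM keyword_stats")
    while True:
        batch = cursor.fetchmany(batch_size)
        if not batch:
            break
        yield from batch  # rows become garbage-collectable once the caller is done with them

def compute_bid(clicks, cost):
    # Placeholder for the real bid calculation.
    return 0.0 if clicks == 0 else round(cost / clicks, 2)

def run_bidding_job(cursor, submit_bid):
    for keyword_id, clicks, cost in fetch_keyword_rows(cursor):
        submit_bid(keyword_id, compute_bid(clicks, cost))
```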
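
The partner-sync rewrite was about cutting database round trips. Roughly the shape of that kind of change, sketched with hypothetical table and column names rather than the real job, is to replace per-record queries with a single bulk fetch and an in-memory diff.

```python
# Sketch: one bulk query plus an in-memory diff instead of a query per keyword.
# Table and column names are hypothetical.

def load_partner_state(cursor):
    """A single query replaces one round trip per keyword."""
    cursor.execute("SELECT keyword_id, bid FROM partner_keywords")
    return {keyword_id: bid for keyword_id, bid in cursor.fetchall()}

def find_out_of_sync(local_bids, partner_state):
    """Compare entirely in memory; no further database calls."""
    return [
        (keyword_id, bid)
        for keyword_id, bid in local_bids.items()
        if partner_state.get(keyword_id) != bid
    ]
```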
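
And for the parallelized API jobs: work like this spends most of its time waiting on the network, so a worker pool is the natural fit. A minimal sketch using Python's `concurrent.futures`; `fetch_report` and `endpoints` are stand-ins, not the real calls.

```python
# Sketch: fan independent API calls out over a worker pool instead of making them serially.
from concurrent.futures import ThreadPoolExecutor

def fetch_report(endpoint):
    # Stand-in for a real API request (e.g. an HTTP GET against a partner API).
    ...

def fetch_all_reports(endpoints, max_workers=8):
    # Threads suit I/O-bound work like this; the pool size would be tuned
    # to whatever rate limits the upstream APIs impose.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(fetch_report, endpoints))
```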
Whenever I work on an internal tool, my priority is always to get feedback from the end users. I like to ask questions about how people plan to use the thing I'm building, build something rough, and use that to continue the discussion of what the final product should look like. At SmarterTravel, I built several new tools and refactored dozens of old ones to incorporate ideas for new functionality or eliminate pain points for our internal users.
I fundamentally believe that fast access to data is key to the success of any part of any business. Engineers are busy and "priorities" often lie elsewhere, so the best data is data that can be accessed without help. At SmarterTravel, I built a SQL Server Analysis Services cube that replaced most of our canned reports and enabled far more data exploration. I also helped analysts proactively expand their knowledge of SQL, which let them dig even deeper into the data.