Experilous

I mentioned in my previous post that I needed to do some serious performance optimizations on my Lua code, and indicated in an edit that after a day’s worth of effort, I managed to speed things up by roughly a factor of 13. I have some more work to do, but in the hopes that the steps I’ve taken so far are helpful to anyone else going through a similar process, allow me to describe some of them before they slip my mind.

After fighting with some silly bugs, and fiddling with map generation and the data structures used to represent the map, I finally reached the point last week where I could begin to assess performance. It was immediately obvious that the pathfinding algorithm was the bane of my program’s execution. Something absurd like 98% of the running time was dedicated to pathfinding. The subsequent operation of following the paths found and adding to congestion was trivial in comparison.
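A simple wall-clock helper built on `os.clock` is enough to get this kind of breakdown; the sketch below is my own illustration, not the post’s actual measurement code, and the function being timed is just a stand-in workload.

```lua
-- Minimal timing helper: runs fn(...) once and reports the elapsed
-- CPU time alongside fn's result. Good enough to see which phase
-- (pathfinding vs. path-following) dominates the total.
local function timeIt(fn, ...)
  local start = os.clock()
  local result = fn(...)
  return os.clock() - start, result
end

-- Stand-in workload: sum the integers 1..n.
local elapsed, sum = timeIt(function(n)
  local s = 0
  for i = 1, n do s = s + i end
  return s
end, 1000)
```

Timing each phase separately and comparing the two numbers is how a 98%-vs-2% split like the one above shows up.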

To more effectively focus on just pathfinding, I created a new project that was designed to run two different pathfinding algorithms over the exact same list of origin/destination pairs, and collect both timing data and algorithmic data on each. One of the pathfinding algorithms would generally be a baseline that I had already tested, and the second would involve some form of experimental tweaking.
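The harness described above can be sketched roughly as follows; the table shapes and field names (`origin`, `destination`, `nodesExamined`) are assumptions of mine for illustration, not the project’s real interfaces.

```lua
-- Run each candidate pathfinder over the same origin/destination pairs,
-- collecting total wall-clock time plus whatever stats each finder
-- reports (here, a hypothetical nodesExamined count).
local function benchmark(finders, odPairs)
  local report = {}
  for name, findPath in pairs(finders) do
    local start = os.clock()
    local nodesExamined = 0
    for _, od in ipairs(odPairs) do
      local path, stats = findPath(od.origin, od.destination)
      nodesExamined = nodesExamined + (stats and stats.nodesExamined or 0)
    end
    report[name] = {
      seconds = os.clock() - start,
      nodesExamined = nodesExamined,
    }
  end
  return report
end
```

Feeding both a baseline and an experimental finder the identical pair list keeps the comparison fair: any difference in the report is due to the tweak, not the workload.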

The tweaks came in two flavors, which I’ll call technical and algorithmic. The technical tweaks were not intended to change behavior significantly, but were only for improving the performance of the operations already being done. The algorithmic tweaks, on the other hand, were intended to make the A* algorithm behave very differently, with the intention of making it intelligently consider fewer nodes along the way to finding an acceptable path. (Under normal operation for the types of maps I’m dealing with, a basic A* algorithm will consider a large number of nodes that it eventually decides aren’t part of the final best path. It would be nice to just ignore those nodes altogether, but to do that perfectly, the algorithm would need to see into the future.)
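For reference, a bare-bones A* in Lua looks something like the sketch below. This is not the post’s code: the graph layout (`node -> { {node = n, cost = c}, ... }`), the caller-supplied heuristic, and the naive open set scanned linearly for the lowest f-score are all my simplifying assumptions. The `examined` counter is the quantity the algorithmic tweaks try to shrink.

```lua
-- Minimal A*: returns the best path from start to goal (or nil) plus
-- a count of nodes examined, which is what the algorithmic tweaks aim
-- to reduce. The open set is a plain table, simple but not fast.
local function aStar(graph, start, goal, heuristic)
  local open = { [start] = true }
  local cameFrom = {}
  local g = { [start] = 0 }                      -- cost from start
  local f = { [start] = heuristic(start, goal) } -- g + heuristic estimate
  local examined = 0

  while next(open) do
    -- Pick the open node with the lowest f-score (linear scan).
    local current, best
    for node in pairs(open) do
      if best == nil or f[node] < best then current, best = node, f[node] end
    end
    if current == goal then
      -- Reconstruct the path by walking cameFrom back to the start.
      local path = { current }
      while cameFrom[current] do
        current = cameFrom[current]
        table.insert(path, 1, current)
      end
      return path, examined
    end
    open[current] = nil
    examined = examined + 1
    for _, edge in ipairs(graph[current] or {}) do
      local tentative = g[current] + edge.cost
      if g[edge.node] == nil or tentative < g[edge.node] then
        cameFrom[edge.node] = current
        g[edge.node] = tentative
        f[edge.node] = tentative + heuristic(edge.node, goal)
        open[edge.node] = true
      end
    end
  end
  return nil, examined
end
```

With a weak heuristic, `examined` grows to include many nodes that never end up on the returned path, which is exactly the waste described above.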

By the end, I was able to calculate around 12,000 paths per second on a grid-like map of highways, major roads, and minor roads that was roughly 100 square miles (260 square kilometers) in size. Each path on average was about 7 miles long, and took roughly 7.5 minutes to travel. Larger maps shouldn’t really affect the speed nearly as much as longer paths would. As long as most paths stay around 7 miles in length, the same speeds should remain achievable on larger maps. (The bigger difficulty would largely come from more paths needing to be computed, as there would probably be more people moving around.)

Below is a (perhaps excessively detailed) chronology of tweaks that I went through over the past week. As a summary, though, I’ll note the following takeaways: