I’ve been struggling with slow queries in ql, an embedded SQL database for Go. It’s easy to use, but complex queries get slow quickly, and using indexes beyond the basic cases seems to be out of the question.
After trying everything I could think of to speed up a query that ran for over five minutes, I started to look at the problem from a different angle. Hitting a file-backed database with multiple tables of tens of thousands of rows isn’t the best way to group and sum data. Then it came to me: why not load the data from the file into an in-memory database (which ql also supports)? It should be a lot faster.
open a file-based database:
db, err := sql.Open("ql", "./ql.db")
open an in-memory database:
mdb, err := sql.Open("ql-mem", "mem.db")
Easy enough, though there is some overhead, of course. You have to initialize the in-memory database the same way you initialized the file-based one — every time the app starts. You also have to load the data into it, save it back, and handle all updates while the app is running.
Reference
ql embedded database: https://github.com/cznic/ql