Well, day two here at PgEast has drawn to a close, and it was another
very informative day.
Today I concentrated on the more common tasks of a Pg DBA, so I attended three
talks (four if you count mine) that were rather heavy on the technical side.
Kevin Kempter drew me back again with his excellent talk on backup and recovery methods,
this time giving some very good advice on how to use and abuse pg_dumpall and
pg_restore. He also touched on three different recipes for PITR on PostgreSQL and gave some handy
advice on when and why to use it.
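To give a flavour of the tools covered (a minimal sketch with made-up host and file names, not Kevin's material): pg_dumpall writes the whole cluster out as plain SQL, while pg_restore works on the custom-format archives produced by pg_dump -Fc.

```python
# Sketch of cluster-wide backup/restore invocations (hypothetical
# host/file names). pg_dumpall emits plain SQL for the whole cluster;
# pg_restore is for single-database custom-format pg_dump archives.

def dumpall_cmd(host: str, outfile: str) -> list[str]:
    return ["pg_dumpall", "-h", host, "-f", outfile]

def dump_custom_cmd(host: str, dbname: str, outfile: str) -> list[str]:
    return ["pg_dump", "-h", host, "-Fc", "-f", outfile, dbname]

def restore_custom_cmd(host: str, dbname: str, archive: str) -> list[str]:
    return ["pg_restore", "-h", host, "-d", dbname, archive]

# e.g. subprocess.run(dumpall_cmd("db1", "cluster.sql"), check=True)
```

Run against a live server these would of course need credentials and a destination database to exist first.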
I also caught another Mongo talk, this time by Steve Francia, on the application of Mongo
in a real-world web retail store. He presented a very convincing argument for the NoSQL side of things in
the retail realm: an RDBMS works great when you have but a few similar products,
such as books, CDs, and movies, but what if you are a retailer who sells jeans, watches, and fresh fruit as well?
Mongo allows for a completely flexible schema, and this fits the diverse retail model well.
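To illustrate the point (a sketch with made-up fields, not Francia's actual schema): documents in one Mongo collection need not share a column set, so each product type carries only its own attributes.

```python
# Heterogeneous "documents" in one products collection (plain Python
# dicts standing in for MongoDB documents; field names are invented).
products = [
    {"sku": "J100", "type": "jeans", "waist": 32, "inseam": 34},
    {"sku": "W200", "type": "watch", "band": "leather", "water_resist_m": 50},
    {"sku": "F300", "type": "fruit", "variety": "mango", "sell_by": "2011-04-01"},
]

# Each document carries only the attributes that make sense for it --
# no sparse columns or one-table-per-product-type juggling required.
jeans = [p for p in products if p["type"] == "jeans"]
```

In an RDBMS the same catalogue would force either very wide, mostly-NULL rows or an entity–attribute–value workaround.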
Another good point he made concerned the archiving of transactions. Taking a product return
as an example: one has to keep a record of all the details of the sale, and one needs some sort of mechanism to reconcile data points such as the price, which might be stale by the time of the return. In
the Mongo world one just keeps the original sales record. A rather elegant solution.
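A sketch of that idea (hypothetical field names): the sale document embeds a snapshot of the product as sold, so a later return refunds the price actually paid even after the catalogue price has moved.

```python
# Catalogue price today (may have changed since the sale was made).
catalog = {"J100": {"sku": "J100", "name": "jeans", "price": 44.99}}

# The sale document embeds the product as it was at sale time --
# no join back to a possibly stale catalogue row is needed.
sale = {
    "order_id": 1,
    "items": [{"sku": "J100", "name": "jeans", "price": 39.99, "qty": 1}],
}

def refund(sale_doc, sku):
    """Refund based on the embedded (historical) price, not today's."""
    item = next(i for i in sale_doc["items"] if i["sku"] == sku)
    return item["price"] * item["qty"]
```

A normalized schema would instead need price-history tables or audit rows to answer the same question.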
I rounded out the day with two technical Pg talks. In the first, Magnus Hagander gave a very informative talk on the differing approaches to the caching problem of web applications with PostgreSQL. By leveraging PostgreSQL's notification system, one can easily build a very robust and scalable cache with another open-source product called Varnish.
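In broad strokes (an in-process Python sketch of the pattern, not Magnus's actual code): a trigger fires NOTIFY when a row changes, and a daemon sitting on LISTEN maps the payload to cached URLs and purges them from Varnish. Here both the cache and the notification delivery are simulated with a dict and callbacks.

```python
# In-process sketch of notification-driven cache invalidation.
# In the real setup the writer runs e.g. NOTIFY from a trigger and a
# daemon on LISTEN issues a PURGE request to Varnish for that URL.

cache = {}        # stands in for Varnish: url -> rendered page
listeners = []    # stands in for LISTEN subscribers

def listen(callback):
    listeners.append(callback)

def notify(payload):
    # stands in for PostgreSQL delivering a NOTIFY payload
    for cb in listeners:
        cb(payload)

def purge(url):
    cache.pop(url, None)    # a Varnish PURGE request in real life

listen(purge)
cache["/item/42"] = "<html>old price</html>"
notify("/item/42")          # row changed -> cached page evicted
```

The appeal of the pattern is that invalidation is driven by the database itself, so the cache can never serve a page the application forgot to expire.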
The second was a talk on migrating from MySQL to PostgreSQL given by Paul Gross. This was an interesting case study of his experience, where he was constrained by a zero-downtime requirement. His solution was to use the ORM he was familiar with, ActiveRecord, to solve the many data-conversion problems he was encountering using Ruby on its own. He used an iterative approach, running a script over and over, at each pass gathering up any changes in the originating MySQL database into PostgreSQL until the two were exactly the same. This was successful, with only a 30-second blackout during the switchover to the new database.
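The iterative catch-up loop can be sketched like this (dicts stand in for the two databases and a version counter for change tracking; Gross worked against the live databases through ActiveRecord):

```python
# Sketch of the iterative migration: copy everything once, then keep
# re-copying rows that changed since the last pass until the two sides
# match, at which point the brief cutover happens.

def sync_pass(source, target, since_version):
    """Copy rows whose change counter moved since the last pass."""
    changed = {k: v for k, v in source.items() if v["version"] > since_version}
    target.update(changed)
    return max((v["version"] for v in source.values()), default=since_version)

mysql = {1: {"version": 1, "name": "a"}, 2: {"version": 1, "name": "b"}}
postgres = {}

seen = sync_pass(mysql, postgres, 0)       # initial bulk copy
mysql[2] = {"version": 2, "name": "b2"}    # a write lands mid-migration
seen = sync_pass(mysql, postgres, seen)    # catch-up pass picks it up
# loop until a pass finds no changes, then flip the application over
```

Each pass shrinks the delta, so the final cutover only has to cover the last few seconds of writes.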
Well, that is about it for day two.