Another key feature of the future will be the increased importance of data management in enabling the semantic web.
Databases are the key to the future of the web. Until now we have focused on the frontend of the web, developing RIAs (Rich Internet Applications) as part of the web 2.0 revolution. But the next generation of the web will be about semantic and context-aware computing, and achieving it will require changes to database technology.
However, a key limiting factor in the next stage of the web's evolution – from web 2.0 to the semantic web – is the way our current relational databases work. These older-style databases are optimized for transaction processing, which guarantees that an atomic transaction either completes in full or is rolled back entirely. The next generation of the web will instead require massively parallel database operations, and only database appliances that are fully optimized for fast parallel processing will truly enable the shift to semantic and context-aware computing.
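The all-or-nothing transaction behaviour described above can be sketched with SQLite's standard transaction handling. This is a minimal illustration, not a claim about any particular product; the table and account names are invented:

```python
import sqlite3

# In-memory database for illustration; any relational database behaves this way.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.execute("INSERT INTO accounts VALUES ('alice', 100), ('bob', 50)")
conn.commit()

try:
    with conn:  # opens a transaction; commits on success, rolls back on error
        conn.execute("UPDATE accounts SET balance = balance - 70 WHERE name = 'alice'")
        # Simulate a failure mid-transaction: the debit above must not survive alone.
        raise RuntimeError("transfer interrupted")
except RuntimeError:
    pass

# The rollback restored the original balance.
balance = conn.execute("SELECT balance FROM accounts WHERE name = 'alice'").fetchone()[0]
print(balance)  # alice still has 100
```

This guarantee is exactly what makes classic relational engines reliable for transactions – and, as argued above, what they were optimized for instead of massively parallel analytics.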
At the recent Gartner Symposium/ITxpo in Sydney, David Wiseman from Sybase introduced their new Analytic Appliance, describing it as a “highly optimized data warehouse analytics appliance.” It is a column-based database optimized for high-speed, massively parallel access, and it is this kind of approach that will enable the next-generation semantic web. Kevin Kelly spoke about this new approach – using databases to power the next generation of the web – recently at the Web 2.0 Summit in San Francisco, calling it the “operational Semantic Web, or World Wide Database, or Giant Global Graph, or Web 3.0”.
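The idea behind a column-based layout can be sketched in plain Python. This is a toy illustration of the principle, not Sybase's actual storage format, and the field names are invented:

```python
# Row store: each record is kept together, so scanning one attribute
# still touches every field of every record.
rows = [
    {"id": 1, "city": "Sydney", "sales": 120},
    {"id": 2, "city": "Melbourne", "sales": 90},
    {"id": 3, "city": "Sydney", "sales": 200},
]

# Column store: each attribute lives in its own contiguous array, so an
# aggregate over "sales" reads only that array -- and the array can be
# split into chunks and scanned in parallel.
columns = {
    "id": [1, 2, 3],
    "city": ["Sydney", "Melbourne", "Sydney"],
    "sales": [120, 90, 200],
}

total = sum(columns["sales"])
print(total)  # 410
```

For analytics workloads that aggregate a few columns over millions of rows, this is why column stores lend themselves to the fast parallel scans described above.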
One thought on “Imagining Technology Futures – part 5”
The interesting thing with the scalable database requirement is that the transition from relational databases to other datastore techniques, e.g. Google AppEngine’s DataStore (on which I’ll be giving a talk at OSDC’s google hackathon), requires a mind shift for developers. This mind shift is neither easy nor quick. Google apparently budgets three months for new hires to get used to MapReduce. And Google hires smart people.
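The programming model behind that mind shift can be hinted at with a toy word count written in the MapReduce style – a minimal sketch of the idea, not Google's actual API:

```python
from collections import defaultdict

def map_phase(documents):
    # Map: emit (word, 1) pairs independently per document -- trivially parallel.
    for doc in documents:
        for word in doc.split():
            yield word, 1

def reduce_phase(pairs):
    # Reduce: group the pairs by word and sum the counts for each key.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["the semantic web", "the world wide database"]
counts = reduce_phase(map_phase(docs))
print(counts["the"])  # 2
```

The shift is that the developer expresses the computation as stateless map and reduce steps rather than SQL queries over shared tables – which is what lets the framework distribute the work across many machines.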