Big Data Is In Need of a Big Brain

One of the big wide-open opportunities for innovation in the 21st century is around “big data”. Every day, approximately 2 billion gigabytes of data are created worldwide, and the volume is growing exponentially (doubling per capita every 40 months). In other words, we are flooded with information. The challenge is to process that information in a timely, intelligent way so that we may live longer, happier, more productive lives (assuming all those things are not contradictory).
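To put that doubling rate in perspective, here is a back-of-the-envelope sketch. The starting figure and the 40-month doubling period come from the claim above; everything else (ignoring population growth, treating the growth as smoothly exponential) is a simplifying assumption for illustration:

```python
# Back-of-the-envelope projection of daily data volume, assuming
# ~2 billion GB created per day today and doubling every 40 months.
# (Ignores population growth, so this understates the total volume.)

def projected_daily_volume_gb(years: float,
                              today_gb: float = 2e9,
                              doubling_months: float = 40.0) -> float:
    """Daily data volume after `years`, under exponential doubling."""
    doublings = (years * 12) / doubling_months
    return today_gb * 2 ** doublings

# Ten years is 120 months = 3 doublings, i.e. 8x today's volume:
print(f"{projected_daily_volume_gb(10):.1e} GB/day")  # → 1.6e+10 GB/day
```

Even at this conservative rate, a decade from now the world would be producing roughly eight times as much data per day, which is why filtering it down to the useful bits matters so much.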

I think the task at hand is similar to the one the human brain performs on a regular basis: it takes in several gigabytes of sensory data per second, and uses only a tiny fraction of that to arrive at facts about the past, present, and future state of its surroundings. As the universal stream of data grows beyond comprehension, the collective machine brain has to grow with it, scrambling desperately to pick out the useful bits.

Companies like Google are leading the way in that effort. But of course, since big data is flooding every crevice of our lives, every company battling it out for our dollar will have to invest in some kind of “big data analytics”. There are lots of opportunities for incremental improvements, but there are even more opportunities for futurists and dreamers to write books about the inevitable rise of intelligent machines, which will surely be based on some mixture of neural networks and genetic programming 😉

The following is a good Google Tech Talk discussion on the current trends in big data:
