Isaac Asimov was right, as usual

In 1959, Isaac Asimov wrote this essay on creativity, speculating about the conditions that foster creativity in humans. As usual, he was right, and I have a story from my PhD that, in its own small way, proves his point.

The story illustrates the essay’s central argument: isolation is definitely necessary, but it’s the informal discussions within small groups that allow people to nurture novel concepts and apply abstract ideas to new contexts.

At the beginning of my PhD, when I was wrestling with algorithms to improve the location of resources in absurdly large distributed networks, I wasn’t really able to produce anything of value, and the only good idea I had (in theory) proved to be catastrophic. The work environment at the time wasn’t helping, since there were too many distractions. So I decided to force some isolation on myself, to really analyse the problem and try to understand why my previous idea (which seemed so reasonable in theory) had such bad results.

I started thinking of nothing but complex search algorithms in distributed networks. I visualised them in my head. I went to sleep with them in mind and sometimes even dreamed about them. And little by little, a new idea started to form. Not only that, but I finally realised why my previous idea would never work.

While developing the new algorithm (which proved not only better than my previous idea but also better than the state of the art at the time), I also came up with a new data structure that would optimise the way the algorithm stored and searched data. After the implementation, when I tested the algorithm with and without that data structure, I marvelled at my own genius for creating such an elegant and efficient solution.

It was only when I presented the results to my advisor that he stated: “This data structure you created here is nothing more than a more complex instance of a Trie.” “A Trie?”, I asked. “Yes, a Trie, a data structure that has existed for decades,” he replied.

So much for “marvelling at my genius”. But at least I was satisfied that I had reached the same conclusion as some brilliant scientist several decades ago. Still, this shows that isolation only works up to a point. Sharing your ideas with others is still necessary.
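For anyone who hasn’t bumped into the structure before: a Trie (or prefix tree) stores keys character by character, so keys that share a prefix also share a path through the tree, which makes prefix lookups cheap. Here’s a minimal sketch of the textbook version in Python — not the more complex variant I built for my algorithm, and the example keys are made up purely for illustration:

```python
class TrieNode:
    def __init__(self):
        self.children = {}   # maps a character to a child TrieNode
        self.is_end = False  # True if a stored key ends at this node


class Trie:
    def __init__(self):
        self.root = TrieNode()

    def insert(self, key):
        """Store a key one character at a time, sharing common prefixes."""
        node = self.root
        for ch in key:
            node = node.children.setdefault(ch, TrieNode())
        node.is_end = True

    def search(self, key):
        """Return True only if this exact key was inserted."""
        node = self._walk(key)
        return node is not None and node.is_end

    def starts_with(self, prefix):
        """Return True if any inserted key begins with this prefix."""
        return self._walk(prefix) is not None

    def _walk(self, key):
        node = self.root
        for ch in key:
            node = node.children.get(ch)
            if node is None:
                return None
        return node


# Toy example: prefix lookups over made-up resource identifiers
t = Trie()
t.insert("node/42/cpu")
t.insert("node/42/mem")
print(t.search("node/42/cpu"))   # True
print(t.starts_with("node/42"))  # True
print(t.search("node/42"))       # False (a prefix, not a stored key)
```

The appeal for resource location is exactly that prefix sharing: both insertion and lookup take time proportional to the length of the key, regardless of how many keys are stored.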

A few months later, my advisor suggested that the department hold some seminar sessions in which all the professors and researchers would present their work, to foster discussion and spark potential partnerships in our research areas. So, in the following weeks, we did exactly that: everyone presented their work to the rest of the department and, at the end, the interested parties would discuss the issues further to help generate new ideas or form new partnerships.

By the time it was my turn, I had already implemented my new algorithm with the Trie-based data structure and was quite happy with the results. However, after presenting my work, while I was discussing the algorithm with a smaller group of colleagues (who had never seen my work before, since they work in completely different areas), two of them suggested simple ideas to try to improve the performance of the network as a whole that had never occurred to me.

At first I was like: “How dare they? Thinking they can improve an algorithm they have just learned about…” But even though that thought crossed my mind, I tried implementing their simple ideas. And surprise, surprise: I got an almost 30% improvement in the overall performance of the network.

I was happy with this, of course, but what bothered me most was that I hadn’t thought of it before. And I realised that this, again, was a by-product of isolation. While isolating myself to study the problem, I had stopped seeing it from other angles, with the fresh eyes of someone looking at it for the first time. And these colleagues, seeing my work for the first time, had simple solutions for things I hadn’t even considered.

So, yes, Isaac Asimov was right in 1959. Creativity is the result of focused, isolated minds that are not afraid to, every once in a while, gather and discuss abstract ideas that contribute to the improvement of the whole.

Modern-day Twilight Zone

Imagine that the typical money ransom is replaced with a bizarre demand: “If you want to get your dear princess back, the prime minister will have to have sexual intercourse with a pig on live television.” Would the prime minister do it? Or, more importantly, in a heavily social-media-driven society where viral content dominates people’s attention spans, would they want him to do it, and would they watch?

Or imagine that you live in a future where the most common way of making a living is to work in huge energy-production buildings where people pedal specialised stationary bikes to produce the energy that the rest of the world consumes. The alternative to this boring and tiresome life is to become a star in worldwide reality shows that range from singing contests to pornography and physical abuse. Is everything better than the bike?

Or imagine instead that everyone has a brain implant that records every memory they have ever had and, using a small external device, lets them rewind and fast-forward to a particular memory and display it on a nearby TV for everyone to see. Now imagine you suspect your wife is cheating on you, and you over-analyse every memory you have of her with the guy you suspect she’s having the affair with. How long would it take you to go insane?

[Image: Black Mirror title card]

These stories are the plots of the three episodes of the first season of Channel 4’s wonderful series Black Mirror, a techno-paranoia drama where each episode features a different story, a different cast and a different reality. In a modern-day Twilight Zone-like setting, each story presents a potential future for our society that, albeit seeming a bit extreme, will definitely leave you wondering whether this is really what you want for your future.

I definitely recommend everyone see it. I dare say it’s mandatory viewing for anyone with any kind of enthusiasm for technology and its role in society.

Are we close to simulating the human brain?

Back in 2007, US researchers simulated half of a virtual mouse brain on a supercomputer. Interestingly, Ray Kurzweil, in his 2005 book “The Singularity Is Near”, accurately predicted the amount of computing power necessary for that scientific achievement.

What’s really interesting is the prediction that he made for 2013. Check this graphic:

[Graph: supercomputer power over time]

According to Kurzweil’s predictions, 2013 is portrayed as the year in which the human race will have enough computing power to simulate an entire virtual human brain.
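To give a rough sense of where that line on the graph comes from, here’s the usual back-of-envelope estimate for a functional brain simulation. The figures below are the commonly cited ballpark numbers, not precise neuroscience, and they land roughly in the 10^14–10^16 calculations-per-second range that Kurzweil discusses in the book:

```python
# Back-of-envelope estimate of the compute needed for a *functional*
# human brain simulation. Ballpark figures only, not precise neuroscience.
neurons = 1e11             # ~100 billion neurons
synapses_per_neuron = 1e3  # ~1,000 connections per neuron (conservative)
updates_per_second = 1e2   # ~100 synaptic events per second per connection

ops_per_second = neurons * synapses_per_neuron * updates_per_second
print(f"{ops_per_second:.0e} operations per second")  # prints 1e+16
```

The fastest supercomputers today are already past 10^16 floating-point operations per second, which is why the curve crosses the “human brain” line right about now — assuming, of course, that raw operations per second is even the right thing to count.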

I still believe that the real challenge in simulating the human brain lies in the software, not in the hardware[1], but this will surely be an interesting year for artificial intelligence research.

  1. There’s still too much about the inner workings of the human brain that we don’t know.

Should we start worrying about this asteroid?

I know this is old news[1], but I guess it’s worth considering: asteroid 2011 AG5, roughly the size of a football stadium, is thought to be on an (albeit improbable) collision course with Earth.


Before you start panicking, let me explain: this asteroid is currently cruising through our Solar System, orbiting the Sun pretty much the same way our 8 planets[2] do. It will pass by our planet in 2023, but on a completely safe trajectory. The problem is its next visit, in 2040. According to scientists, there is a 1 in 625 chance that the asteroid will hit Earth then.

Scientists will know more about the asteroid (and its trajectory) once it comes out from behind the Sun, which should happen later this year. Future observations will probably reveal that there’s no reason to worry. But… what if? The chances are really slim, but after seeing what an (estimated) 15-metre asteroid did in Russia, I think we should start considering what our plan is going to be.

Don’t you?

  1. In Internet Years™, it’s ancient really!
  2. Sorry, Pluto!