|
June 18, 2015 2:18 PM
Posted By Peter Bentley
|
Some time ago I was asked by Dominic Burgess (a freelance graphic designer, composer, and more recently, comic actor) to help explain a little something about Moore's Law for a series of short comedy films he was producing for the Guardian website. I spoke about it on camera, and below I'll quote what I wrote on the topic in my book Digitised:
Gordon Moore and Robert Noyce founded a certain company called Intel in 1968. Their work and the work of many other pioneers around the world resulted in a transformation of electronics. While early integrated circuits only had a few hundred or thousand transistors, steady improvements in manufacturing techniques allowed more and more transistors to be placed on one chip. (Much of the early growth in the 1960s was driven by the American missile and Apollo Space programmes.)
The growth in complexity of integrated circuits led Gordon Moore to make a prediction in 1965: he noted that from the invention of the IC in 1958 to 1965, the number of transistors on a chip had doubled each year, and he predicted that this trend would continue for at least another decade. He later revised this to say that the number of transistors on a chip would double every two years. In 1970 the Caltech professor Carver Mead coined the term “Moore’s Law” for this prediction.
Remarkably, the “law” appears to have held ever since. By the mid 1970s there were ten thousand transistors on a single chip. By 1986 we could fit more than one million transistors on one chip. By 2005 we could fit a billion transistors on a chip. Although there are frequent predictions that Moore’s Law will soon break down because the tiny size of transistors is now approaching the limits allowed by the laws of physics, so far the improvement of this remarkable technology continues unchecked.
Computer technology has always embraced the very latest in electronics, so the amazing improvements in silicon chips corresponded to an equally amazing improvement in computers from the 1960s onwards. A colleague of Moore's at Intel, David House, estimated that Moore’s Law meant that computer performance would double every eighteen months. He was nearly right – for many years, computer power has doubled about every 20 months. As Moore was to joke later: "House said 18 months, not me."
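As an aside, the doubling arithmetic behind these figures is easy to play with yourself. The short Python sketch below is purely illustrative (the 1971 baseline of roughly 2,300 transistors, the Intel 4004, is my own example figure, not something from the excerpt): doubling every two years compounds to roughly a thousandfold increase every twenty years, and House's 18-month performance doubling works out at about 100x per decade versus about 64x at 20 months.

```python
# Illustrative sketch of the doubling arithmetic behind Moore's Law.
# Assumption: a 1971 starting point of ~2,300 transistors (the Intel 4004).

def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Projected transistor count if the count doubles every `doubling_years` years."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

def speedup(months, doubling_months):
    """How many times faster computers get after `months`, for a given doubling period."""
    return 2 ** (months / doubling_months)

# Doubling every two years compounds to roughly a thousandfold gain per twenty years:
print(f"1971 -> 1991: {transistors(1991) / transistors(1971):.0f}x more transistors")

# David House's 18-month figure versus the ~20 months observed in practice:
print(f"Gain over a decade at 18 months per doubling: {speedup(120, 18):.0f}x")
print(f"Gain over a decade at 20 months per doubling: {speedup(120, 20):.0f}x")
```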
This is the truth behind Moore's Law, but somehow Moore frequently gets the credit for David House's prediction. In Dominic's comedy sketch his rather crazy character takes all kinds of liberties with the facts (and I think manages to chop off my fingers too). But hopefully the main ideas still manage to make it through!
You can watch his comedy video online here.
|
May 22, 2015 4:47 PM
Posted By Peter Bentley
|
Last month I was interviewed for a feature in New Scientist written by Sean O'Neill. It explores the idea that computers could simulate the universe and within that simulation, artificial life might arise. Would that make us gods of that universe? And do the inhabitants deserve our respect or should we treat them like just another piece of software? This is what Sean wrote:

|
January 14, 2015 12:47 PM
Posted By Peter Bentley
|
Artificial Intelligence has never been more popular, with many recent movies and books having fun with the ideas. Research in AI and Machine Learning has never been stronger, with more people creating more advances than ever before. We're now seeing more and more mainstream applications - look at Siri on your iPhone for an everyday example of state-of-the-art AI. Your credit card company uses basic machine learning to alert you to potential fraud. Your car might even have the ability to apply the brakes and help you avoid a collision. There are new companies specialising in creating AI software - I am a consultant for one called Braintree Ltd, which has the real aim of creating Strong AI in the future.
But there has also been another recent trend - the rise in AI scaremongering. Professors, entrepreneurs, and other supposedly knowledgeable people are increasingly being reported as proclaiming that AI poses a real danger to the future of humanity. Whether or not they actually made these claims, they should know better than to let them stand in the press.
It's nonsense. Rubbish. Idiocy. It might even be downright damaging, in the same way that the negative publicity around GM food or stem cell research caused real damage to research funding and progress. It's also not new. AI research is as old as computers, and it has been through this several times in the past - silly claims and predictions, leading to a loss of confidence, leading to "dark ages" of AI research where no funding can be obtained.
The bottom line is that we try very very hard to make "intelligent" software. We get a few really neato results for very niche applications. We will continue to make really neato applications that process information better than we can, and soon we'll have lots of very helpful tools that make our lives easier and safer. But despite the science fiction visionaries and their silly predictions, we have little clue how to make real intelligence. We don't understand consciousness or emotions; we don't understand how and why brains are structured in the way they are; we don't understand so much that we simply cannot make an AI. Maybe with enough resources we could evolve one with a combination of genetic algorithms, developmental processes and neural networks. But we don't understand how to do this well enough yet, and we don't have the computational resources.
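To give a flavour of what "evolving" intelligence even means, here is a deliberately trivial sketch of my own: a genetic algorithm evolving the weights of a tiny neural network to compute XOR. It has nothing to do with Braintree's work or any real AI, and whether this toy even converges depends on the random seed - which rather underlines how far we are from the real thing.

```python
# Toy example: a genetic algorithm evolving a tiny 2-4-1 neural network to solve XOR.
import numpy as np

rng = np.random.default_rng(0)

# The "world" this toy intelligence must master: the XOR truth table.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

N_WEIGHTS = 2 * 4 + 4 + 4 * 1 + 1   # weights plus biases of a 2-4-1 network = 17 numbers

def forward(genome, x):
    """Run the 2-4-1 network encoded by a flat genome of 17 numbers."""
    w1, b1 = genome[:8].reshape(2, 4), genome[8:12]
    w2, b2 = genome[12:16].reshape(4, 1), genome[16]
    h = np.tanh(x @ w1 + b1)
    return 1.0 / (1.0 + np.exp(-((h @ w2).ravel() + b2)))

def fitness(genome):
    """Negative mean squared error on XOR: higher is better."""
    return -np.mean((forward(genome, X) - y) ** 2)

# Bare-bones generational GA: truncation selection, uniform crossover, Gaussian mutation.
pop = rng.normal(0.0, 1.0, size=(100, N_WEIGHTS))
for generation in range(300):
    scores = np.array([fitness(g) for g in pop])
    elite = pop[np.argsort(scores)[-20:]]                 # keep the best 20 genomes
    parents = elite[rng.integers(0, 20, size=(100, 2))]   # pick random pairs of parents
    mask = rng.random((100, N_WEIGHTS)) < 0.5             # uniform crossover
    pop = np.where(mask, parents[:, 0], parents[:, 1])
    pop += rng.normal(0.0, 0.1, size=pop.shape)           # Gaussian mutation
    pop[0] = elite[-1]                                     # elitism: keep the best unchanged

best = max(pop, key=fitness)
print("Evolved XOR outputs:", np.round(forward(best, X), 2), "(target: 0 1 1 0)")
```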
So I suggest - if you're worried about technologies that pose a real danger to humans, forget AI. Worry about the automobile. Worry about trains or aircraft. Worry about water processing facilities, power stations. Worry about the decay of societies caused by excessive TV or video game playing. Worry about people doing harm to themselves. Worrying about AI is no different from worrying about how teleportation or antigravity will destroy humanity. It's seriously not an issue, and won't be an issue for a long, long, long, long time.
These are some of the things I would say if given a bit more time. In the land of TV however I only get about 15 seconds, so here's what I did say on ITV news recently.
|
May 29, 2014 8:59 PM
Posted By Peter Bentley
|
There's a strange irony in the world of publishing. Science sells. (I know, because I write a fair amount of nonfiction - please check out my books!) Whether it's nonfiction science books like mine or magazines such as New Scientist or Wired, science and technology are massively popular. That means they attract big audiences, and generally those of us who write science and tech articles and books do reasonably well. Certainly well enough to make a living.
Science fiction is also massively popular. Some of the most successful movies of all time have been science fiction; many hugely popular TV shows are science fiction. There are more science fiction authors and books on Amazon than ever before. (I've also published a few science fiction short stories, in case you're interested!)
But here's the weird fact: today a successful writer of science nonfiction will get 20 to 50 times the money per word compared to a successful writer of science fiction. That means that very very few science fiction writers can make a living from their craft.
The ironies continue. Despite its name, science fiction publishing is not embracing the latest technologies well. Most methods of publishing (markets) seem stuck in the past. Most publishers are resistant to change, while being terrified of the threat from giants such as Amazon, who now enable authors to self-publish en masse. Many authors are now either self-publishing or feeling frustrated that their work is becoming lost amongst the tidal wave of - sometimes mediocre - writing that is now available to readers. Many amazing science fiction stories remain undiscovered by readers - and by movie producers, as is evident from the sometimes embarrassingly poor or recycled storylines of recent movies.
I happen to be an avid science fiction reader. I may write nonfiction, but my bookshelves are largely filled with science fiction books. So it greatly frustrates me that the genre of science fiction is seemingly losing its way in the modern world. But I don't have the answers. I wish I did! In fact I mainly have a lot of questions. So from today we are launching a survey for science fiction writers to find out more. It's not clear if we can do anything to help, but what we can do is find out what everyone thinks. If you are a science fiction writer or you know someone who is, then please check out our survey below!

|
February 23, 2014 8:43 AM
Posted By Peter Bentley
|
Since I first created my app iStethoscope in 2008, things have rapidly changed in the world of medical technology. Much of it has to do with the proliferation of cheap consumer technology. Mobile phones, tablets and everything in between are now packed full of sensors, and manufacturers are starting to make many plug-in gadgets to help monitor every aspect of our health. The rumours of Apple's future ventures into watches and other wearable devices that can be used for fitness and health purposes will only increase this trend, I'm sure. On this topic I was recently interviewed by journalist Arnaud Devillard for a French monthly news magazine. As you might expect, it's all in French, but you can read the full set of articles online here:
http://www.atelier.net/sites/default/files/20140201-sciences-et-avenir.pdf
|
January 10, 2014 8:33 PM
Posted By Peter Bentley
|
After a huge amount of work by Soo Ling, our activity company Kazoova is now online and we're looking for expert organisers to join and provide their talents for their communities. We're very excited about it, and to celebrate the new year we've created a video! Check it out here:

|
September 4, 2013 11:15 AM
Posted By Peter Bentley
|
Perhaps because of my popular science books, such as the recent Digitized, I am frequently asked for quotes on, or explanations of, recent advances in computing. The article released a couple of days ago on the CNN website is definitely more the latter. I was asked first about the new Intel tri-gate transistor and then increasingly about quantum computing, and I seem to have been quoted quite extensively in this one. It's a very high-level piece, as the journalists did not feel it was appropriate to talk about details such as quantum entanglement. You can read it below or click on the image to see the original.

|
August 16, 2013 5:58 PM
Posted By Peter Bentley
|
Some months ago a New Scientist writer with a sublime name - Will Heaven - (and an ex-UCL researcher) came to chat to me about parallel computing. I waffled at length, mentioning my systemic computer as I am often known to do. To my surprise, Will has just got back in touch, telling me that he has written a whole feature which is coming out this week in the mag. I think it makes quite a nice read. It's not often I'm quoted saying the word "piddling", either... It's such a long piece that I can't include all of it below, so instead click on the image to read the PDF.

|
July 27, 2013 1:26 PM
Posted By Peter Bentley
|
I've been many things in my career to date. A scientist, science writer, author, science communicator, public speaker, professor. I've also sometimes been a programmer, an app developer, web designer, photographer, editor, and publisher. I've also been called quite a few things. A maverick, polymath, and, well, a few less kind things too. (Someone even threatened to punch me at a recent conference because of my charming smile.) But in this recent article for Discovery News in which I am quoted, I have a new title: Psychologist. Um... really? It's true I've been working with quite a few of them recently, but this one is really a bit inaccurate! You can read the article, which is about bad luck, here.
|
April 12, 2013 6:24 PM
Posted By Peter Bentley
|
I was commissioned a couple of months ago to write another piece for BBC Focus magazine. They wanted a speculative article discussing what the world might be like if the technology existed to upload our memories into computers. I couldn't resist giving them a short story - not what they asked for, but they liked it so much that it was published in the April edition of the magazine. Here's a copy below if you want to read it!
|