Tuesday, July 21, 2009

Exponential Technology Growth

Much has been written about the accelerating rate of technological advancement. As humans we are used to seeing and understanding linear growth, but exponential growth is harder to comprehend, as are its impacts.

In his essay The Law of Accelerating Returns, Ray Kurzweil lays out the laws of accelerating returns:

  • Evolution applies positive feedback in that the more capable methods resulting from one stage of evolutionary progress are used to create the next stage. As a result, the rate of progress of an evolutionary process increases exponentially over time.
  • Over time, the "order" of the information embedded in the evolutionary process (i.e., the measure of how well the information fits a purpose, which in evolution is survival) increases.
  • A correlate of the above observation is that the "returns" of an evolutionary process (e.g., the speed, cost-effectiveness, or overall "power" of a process) increase exponentially over time.
  • In another positive feedback loop, as a particular evolutionary process (e.g., computation) becomes more effective (e.g., cost effective), greater resources are deployed toward the further progress of that process. This results in a second level of exponential growth (i.e., the rate of exponential growth itself grows exponentially).
  • Biological evolution is one such evolutionary process.
  • Technological evolution is another such evolutionary process. Indeed, the emergence of the first technology creating species resulted in the new evolutionary process of technology. Therefore, technological evolution is an outgrowth of--and a continuation of--biological evolution.
  • A specific paradigm (a method or approach to solving a problem, e.g., shrinking transistors on an integrated circuit as an approach to making more powerful computers) provides exponential growth until the method exhausts its potential. When this happens, a paradigm shift (i.e., a fundamental change in the approach) occurs, which enables exponential growth to continue.
Kurzweil has developed his hypothesis by creating a number of interesting graphs like this one, which map out the exponential rate of technology and specifically of computing power:

While there is continued debate on whether computers will ever reach the level of intelligence (itself a difficult concept to define) of biological entities, the implications for our lives are unmistakable.

One aspect of his hypothesis was particularly thought provoking: because of the exponential growth of technology, the 21st century will see as much technological advancement as the previous 20,000 years. Many pessimists about the future discount this acceleration and the compression of technological advancement into shorter and shorter periods of time. To put it another way, we will see a century's worth of advancement in just 25 years.
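The arithmetic behind that compression can be sketched quickly. Below is a minimal Python sketch which assumes, purely for illustration (the exact parameters are not taken from Kurzweil's essay), that the rate of progress doubles every ten years. Even under that simplified assumption, a century of calendar time accumulates thousands of years of progress measured at today's rate.

```python
# A minimal sketch of the arithmetic behind compressed progress.
# Assumption (illustrative only): the rate of progress doubles every 10 years,
# starting from 1 "progress-year" per calendar year today.

DOUBLING_PERIOD = 10  # years; an illustrative assumption

def equivalent_progress(calendar_years: int) -> float:
    """Total progress, in years-at-today's-rate, accumulated over the given
    number of calendar years when the rate doubles every DOUBLING_PERIOD years."""
    total = 0.0
    for year in range(calendar_years):
        total += 2 ** (year / DOUBLING_PERIOD)  # rate of progress in that year
    return total

for horizon in (25, 50, 100):
    print(f"{horizon:>3} calendar years ~ {equivalent_progress(horizon):,.0f} "
          "years of progress at today's rate")
```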

Friday, June 05, 2009

Genomics - Upgrade the software that runs the hardware

Fascinating talk by Barry Schuler on genomics. What struck me most was the part about how you can change the software, or write a plug-in, and if it's compatible, run it on the same organic hardware.
 Not only does it do that; if you took the genome -- that synthetic genome -- and you plugged it into a different critter, like yeast, you now turn that yeast into Mycoplasma. It's, sort of, like booting up a PC with a Mac OS software. Well, actually, you could do it the other way. So, you know, by being able to write a genome and plug it into an organism, the software, if you will, changes the hardware. And this is extremely profound.

Thursday, April 09, 2009

The most basic communication

Fascinating talk by Bonnie Bassler on the communication systems used by bacteria. I am fascinated by how so-called simple organisms can become so powerful using nothing but effective communication techniques. Communication and cooperation seem to be the way forward, as bacteria took billions of years to figure out. Web 2.0 and Facebook are just modern multi-cellular equivalents of a billion-plus-year-old technique. Now if only we could figure out what it was all for.

Sunday, June 22, 2008

Wicked Problems

Wicked Problems have incomplete, contradictory, and changing requirements; and solutions to them are often difficult to recognize as such because of complex interdependencies. Rittel and Webber stated that while attempting to solve a wicked problem, the solution of one of its aspects may reveal or create other, even more complex problems.

Addressing or trying to define the elements of a wicked problem could be one of the potential uses of the semantic framework.

Specific examples of wicked problems / social messes include issues such as global climate change, healthcare in the United States and elsewhere, the AIDS epidemic and perhaps other emerging diseases, pandemic influenza, international drug trafficking, homeland security, and nuclear energy and waste.

Rittel and Webber's (1973) formulation of wicked problems[2] specifies ten characteristics, perhaps best considered in the context of social policy planning. According to Ritchey (2007)[3], the ten characteristics are:

  1. There is no definitive formulation of a wicked problem.
  2. Wicked problems have no stopping rule.
  3. Solutions to wicked problems are not true-or-false, but better or worse.
  4. There is no immediate and no ultimate test of a solution to a wicked problem.
  5. Every solution to a wicked problem is a "one-shot operation"; because there is no opportunity to learn by trial-and-error, every attempt counts significantly.
  6. Wicked problems do not have an enumerable (or an exhaustively describable) set of potential solutions, nor is there a well-described set of permissible operations that may be incorporated into the plan.
  7. Every wicked problem is essentially unique.
  8. Every wicked problem can be considered to be a symptom of another problem.
  9. The existence of a discrepancy representing a wicked problem can be explained in numerous ways. The choice of explanation determines the nature of the problem's resolution.
  10. The planner has no right to be wrong (planners are liable for the consequences of the actions they generate).
The semantic framework will not in itself solve a wicked problem. It will instead define each of the elements of a wicked problem. How those definitions are developed, and how to get people to agree to them, will remain problems in their own right and would still require the use of frameworks like the Issue-Based Information System (IBIS). A semantic framework could form the underlying foundation for using an IBIS-type system.
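As a rough illustration of what "defining each of the elements" might look like in practice, here is a minimal sketch of the core IBIS element types (issues, positions, and arguments). It is not the API of any existing IBIS tool; the class and field names are purely illustrative.

```python
# A minimal sketch (not any particular tool's API) of how core IBIS elements
# -- issues, positions, and arguments -- might be captured so that a semantic
# framework could link and reason over them.
from dataclasses import dataclass, field

@dataclass
class Node:
    text: str
    children: list = field(default_factory=list)

@dataclass
class Argument(Node):
    supports: bool = True   # argues for (True) or against (False) its parent position

@dataclass
class Position(Node):
    pass                    # children are Arguments

@dataclass
class Issue(Node):
    pass                    # children are Positions or sub-Issues

# Example: one element of a wicked problem expressed as an IBIS fragment
issue = Issue("How should nuclear waste be stored long-term?")
deep_storage = Position("Store it in deep geological repositories")
deep_storage.children.append(Argument("Provides isolation over geological timescales", supports=True))
deep_storage.children.append(Argument("Siting decisions create new political conflicts", supports=False))
issue.children.append(deep_storage)
```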

I will continue to explore what software tools exist to document wicked problems and whether a basic semantic framework can be developed using a combination of existing tools and frameworks.

Sunday, June 15, 2008

Just circuits?

V.S. Ramachandran is a mesmerizing speaker, able to concretely and simply describe the most complicated inner workings of the brain. His investigations into phantom limb pain, synesthesia and other brain disorders allow him to explore (and begin to answer) the most basic philosophical questions about the nature of self and human consciousness.
This is a fascinating talk (as all TED Talks are). It leads to the question: are we all just wired up like circuit boards? Can we rewire and short-circuit our hardware (brains) and, more importantly, can we hack the software that runs on the hardware?



Ramachandran is the director of the Center for Brain and Cognition at the University of California, San Diego, and an adjunct professor at the Salk Institute. He is the author of Phantoms in the Brain, the basis for a Nova special, and A Brief Tour of Human Consciousness; his next book, due out in January 2008, is called The Man with the Phantom Twin: Adventures in the Neuroscience of the Human Brain.

Information Hunter Gatherers

Information gathering is only the latest iteration of the natural human tendency to be a hunter-gatherer. There is a strong compulsion to check in on new information. There is almost a primeval sense of anticipation as you fire up your email client and watch expectantly to see what new information your snare has caught. Sometimes the fare is meager: an email with the latest meme or the skateboarding pug from YouTube. At other times it is something more substantial that feeds the intellect and satisfies the age-old urge to hunt and gather. Is the next logical progression to settle down, organize and cultivate? Has that already begun with consumer-created content, wikis and the rise of the prosumer?

Sunday, June 01, 2008

Knowledge Representation and the Semantic Web

The Semantic Web has been attracting considerable attention in the last few years. From the point of view of Knowledge Representation, the Semantic Web affords opportunities for both research and application. However, several aspects of the Semantic Web, as it has been envisioned, cause problems from the Knowledge Representation viewpoint. Overcoming some of these problems has resulted in a more formal basis for the Semantic Web and an increase in the expressive power of Semantic Web languages. Others of these problems still remain and call for a new vision of the Semantic Web from a Knowledge Representation viewpoint.

Thursday, July 05, 2007

Mashup Creators

A mashup is a website or application that combines content from more than one source into an integrated experience. They were once something that geeks pulled together, typically overlaying information onto a map or pulling together two information sources in some interesting way. In the past you needed to be a pretty good programmer to create and publish a mashup.

Microsoft, Google and Yahoo have all nearly simultaneously released mashup creators which are accessible to the average user.
The approach that all of these tools take is that it's the content that is important and that's what you should care about; leave all the code to us. This is great news for the average user who doesn't want to mess with JavaScript but still wants to see his/her data in a useful way.
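To make the idea concrete, here is a minimal sketch of what a mashup does underneath: it joins two otherwise unrelated data sources on a shared key. Both feeds below are hypothetical stand-ins for what these tools would actually pull from live web APIs.

```python
# A minimal sketch of the idea behind a mashup: joining two independent data
# sources on a shared key. The feeds are hypothetical, in-memory stand-ins for
# what would normally come from two different web services.

store_feed = [
    {"city": "Seattle", "store": "Pike Place Outlet"},
    {"city": "Austin",  "store": "Downtown Branch"},
]

weather_feed = {
    "Seattle": {"temp_f": 62, "conditions": "Cloudy"},
    "Austin":  {"temp_f": 95, "conditions": "Sunny"},
}

# The "mashup": one combined view built from both sources
combined = [
    {**store, **weather_feed.get(store["city"], {})}
    for store in store_feed
]

for row in combined:
    print(row)
```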

The ease with which you can now create a mashup, and the way mashups are slowly gaining acceptance in the non-programmer community, is similar to the way in which websites developed. It used to be that you needed to be an experienced web developer to create your own website, but today almost anybody can create one using WYSIWYG tools.

The Semantic Web

Over the years, there has been a lot of interest generated around the concept of the ‘Semantic Web’. At its core, the Semantic Web is a set of design principles which aim to assign meaning to electronic content so that it can be understood not only by humans but also by machines.

Tim Berners-Lee originally expressed the vision of the semantic web as follows:
“I have a dream for the Web [in which computers] become capable of analyzing all the data on the Web – the content, links, and transactions between people and computers. A ‘Semantic Web’, which should make this possible, has yet to emerge, but when it does, the day-to-day mechanisms of trade, bureaucracy and our daily lives will be handled by machines talking to machines. The ‘intelligent agents’ people have touted for ages will finally materialize.”

Though Tim Berners-Lee’s vision sounds fantastic and almost impossible, the practical applications of his vision can be applied using existing technology and a growing body of emerging standards. His long-term vision of machines as ‘intelligent-agents’ has more mundane applications in the present. The present suite of semantic technologies is not intended to replace human input and intelligence but merely to allow machines to have a slightly better understanding of the content of the materials they are dealing with.

An example of this is a picture of an apple. Humans can easily spot a picture of an apple and identify it as such. For computers this is a very complex task. The traditional approach to teaching a computer to recognize a picture of an apple is to label the picture as an apple, creating metadata that identifies the content of the picture as an apple. But this approach is lacking because the computer does not know any additional information about the apple. It knows that the picture is of an apple but it does not really understand what an apple is. To teach a computer what an apple is, an ontology needs to be created which describes the properties and attributes of an apple: this is a picture of an apple, which belongs to the class of fruits; it is red in color and is suitable for human consumption. Teaching a computer what an apple is by providing this additional information can add immense value to the services the computer can now provide. The computer now knows that the apple is a fruit, knows its color, and knows that humans can eat it. As more ontology data is created (by humans), the computer slowly begins to understand the world and can provide value-added services.
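As a rough illustration, here is a minimal sketch of those apple statements expressed with the rdflib Python library. The namespace and the property names (ex:depicts, ex:color, ex:suitableForHumanConsumption) are made up for the example rather than drawn from any standard vocabulary.

```python
# A minimal sketch, using the rdflib library, of the kind of ontology
# statements described above. The example.org namespace and its terms are
# illustrative assumptions, not a standard vocabulary.
from rdflib import Graph, Namespace, Literal, RDF, RDFS

EX = Namespace("http://example.org/ontology/")

g = Graph()
g.bind("ex", EX)

# Class-level knowledge: what an apple *is*
g.add((EX.Apple, RDF.type, RDFS.Class))
g.add((EX.Apple, RDFS.subClassOf, EX.Fruit))
g.add((EX.Apple, EX.color, Literal("red")))
g.add((EX.Apple, EX.suitableForHumanConsumption, Literal(True)))

# Instance-level metadata: this particular picture depicts an apple
g.add((EX.picture42, RDF.type, EX.Picture))
g.add((EX.picture42, EX.depicts, EX.Apple))

# Serialize the graph as Turtle (returns a string in recent rdflib versions)
print(g.serialize(format="turtle"))
```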

The concept of creating an ontology is not new and has been around for centuries. But information technology is now at a point where it is possible for computers to consume ontologies and apply them to content.