Some thoughts on the “Internet of Things” or so-called “Web 3.0”
By Philippe GAUTIER on Monday 21 April 2008, 11:12 - Web 3.0 / Internet of Things / Internet des Objets - Permalink
Recently, Vinton Cerf drew a connection between the rising concept of the Internet of Things ("I-o-T" is an umbrella term covering several others, such as "M2M", "sensor, mesh, pervasive or ubiquitous networks", "EPCGlobal standards", etc.) and the term "Web 3.0" (seen in an article on www.lemonde.fr, 5 April 2008).
Rather than dismissing it as yet another piece of IT marketing, I have tried to work out why such a term would be needed, especially considering that most people still don't know what Web 2.0 is…
Web 2.0 can actually be seen as an evolution of the Web (1.0?) that strengthens notions of interoperability, collaboration and the pooling of resources (Web services, dynamic pages, etc.). Its features are mainly based on networking (social or professional), syndication, publication (blogs, wikis, etc.) and the sharing of spaces or services (including virtual worlds controlled by humans, such as "Second Life").
Taking this definition as a starting point, I have dug into my own implementation and R&D experience with the ongoing EPCGlobal standards (including RFID) and found some interesting lessons.
First of all, the Internet of Things emphasizes an already well-known dimension of the Web: complexity.
For those who know a bit about it (older generations of software developers, organization engineers, physicists, mathematicians, etc.), "complexity" is anything but linearity, determinism or process; all of which generally underpin the "traditional", analytical, top-down approaches that generations of managers, CIOs, quality engineers and consultants (ITIL, ISO, etc.) are used to dealing with.
Web 2.0 initiated this phenomenon (increasing complexity, reducing determinism) on a global scale (a worldwide network), and we have accordingly experienced a merging of what had been considered in the "old" times as independent spheres: business, private life, citizenship, consumerism, etc.
In other words, it has started to dissolve existing boundaries, as well as our former visions and models of organization.
However, managing complexity often requires a measure of __fuzzy logic__ or uncertainty (e.g. artificial intelligence). In Web 2.0, this is still mainly supplied by the "underlying framework" of people (i.e. human brains) behind each node of the worldwide network: human brains remain the main "component" for dealing with complexity.
In the Internet of Things, objects will be duly identified (in most cases with serialized or unique identifiers) and will be capable of acting as autonomous media: either directly, for those carrying active IDs with embedded software code, or indirectly, through the network.
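To make "serialized identity" concrete, here is a minimal sketch loosely modelled on the EPCGlobal "pure identity" URN form for serialized trade items (SGTIN); the company prefix, item reference and serial numbers below are invented for illustration only:

```python
# Sketch of a serialized object identity, loosely following the EPCGlobal
# SGTIN "pure identity" URN form. The numeric values are made up.

def sgtin_urn(company_prefix: str, item_ref: str, serial: int) -> str:
    """Build an SGTIN-style URN: each physical instance carries its own serial."""
    return f"urn:epc:id:sgtin:{company_prefix}.{item_ref}.{serial}"

# Two items of the same product class are nonetheless distinct individuals:
a = sgtin_urn("0614141", "112345", 400)
b = sgtin_urn("0614141", "112345", 401)
print(a)       # urn:epc:id:sgtin:0614141.112345.400
print(a == b)  # False: one unique identity per physical object
```

It is this per-instance identity (rather than a per-class barcode) that lets each object be addressed, and eventually act, as an individual node.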
This could potentially spread new independent actors over the existing Internet, capable of reacting to and influencing the whole system (think of the "butterfly effect", well known since H. Poincaré and later work such as chaos theory).
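The "butterfly effect" can be illustrated in a few lines with the logistic map, a classic toy model from chaos theory (the starting value, the size of the perturbation and the number of steps are arbitrary choices for this sketch):

```python
# Sensitive dependence on initial conditions with the logistic map
# x -> r * x * (1 - x) in its chaotic regime (r = 4).

def orbit(x0: float, r: float = 4.0, steps: int = 60) -> list:
    """Iterate the logistic map and return the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = orbit(0.2)
b = orbit(0.2 + 1e-9)  # perturb the start by one part in a billion
divergence = max(abs(x - y) for x, y in zip(a, b))
# The gap between the two trajectories grows far beyond the initial 1e-9.
print(divergence)
```

A vanishingly small change in one "actor" ends up reshaping the whole trajectory, which is exactly the risk (and the richness) of a network of autonomous objects.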
In apprehending such a phenomenon, we could consider that a major evolution behind the Internet of Things would be to reverse the current paradigm in the design of IT architectures: a "bottom-up" approach would be preferable to a "top-down" one in order to efficiently address the subsidiary levels (the layers closest to the physical object and its lifecycle). Pushing this approach further, the next big stake would be to put a measure of fuzzy logic at the object level itself (or at least in its virtual representation), giving it the ability to act fully as an "independent entity" or actor on the Internet, with capabilities of adaptation, reaction, initiative and situated analysis.
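A hypothetical sketch of what "fuzzy logic at the object level" might look like: a tagged object's virtual representation reacting to a sensor reading with graded membership values rather than a hard rule. The membership function and thresholds below are invented for illustration; a real deployment would derive them from the object's lifecycle data.

```python
# Hypothetical sketch: an object's virtual representation using a simple
# fuzzy membership to decide how to react. All values are illustrative.

def hot_degree(temp_c: float) -> float:
    """Fuzzy membership in the set 'too hot': 0.0 at <= 25 C, 1.0 at >= 40 C."""
    return min(1.0, max(0.0, (temp_c - 25.0) / 15.0))

def decide(temp_c: float) -> str:
    """Graded reaction instead of a single crisp threshold."""
    degree = hot_degree(temp_c)
    if degree > 0.8:
        return "raise alert"
    if degree > 0.3:
        return "request recheck"
    return "do nothing"

print(decide(20.0))  # do nothing
print(decide(33.0))  # request recheck
print(decide(39.0))  # raise alert
```

The point of the graded membership is that the object can modulate its initiative (recheck vs. alert) instead of following one deterministic, top-down rule.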
A large part of the fuzzy logic in the global Internet would therefore no longer be provided by human beings (as it is today) but by objects (or their virtual representations), Web services, documents, namespaces, etc.
This would have to be considered a major breakthrough and a revolution for the Internet.
Calling this "Web 3.0" therefore has a tangible meaning, all the more so since this Internet will be able, to some extent and for the first time, to live "independently" of its human designers.
To make it real, the big challenge would now be to provide and industrialize suitable solutions, whose frameworks could be based on fuzzy logic, decision-making or systemic approaches, parallelism and chaotic models.
As long as we keep using "process-based" or analytical approaches, we won't be able to cross the threshold between the existing Web 2.0 and the upcoming Web 3.0… and objects won't be able to subscribe to Facebook by themselves in order to interoperate!
Copyright Philippe GAUTIER - April 2008