The Importance of Cross-layer Considerations in a Standardized WSN Protocol Stack Aiming for IoT: The Internet of Things (Ubiquity symposium)
by Bogdan Pavkovic, Marko Batic, Nikola Tomasevic
The Internet of Things (IoT) envisages expanding the current Internet with a huge number of intelligent communicating devices. Wireless sensor networks (WSNs) integrating IoT will rely on a set of open standards striving to offer scalability and reliability in a variety of operating scenarios and conditions. Standardized protocols will tackle some of the major WSN challenges like energy efficiency, intrinsic impairments of the low-power wireless medium, and self-organization. After more than a decade of tremendous standardization efforts, we can finally witness an integral IP-based WSN standardized protocol stack for IoT. Nevertheless, the current state of standards has redundancy issues and can benefit from further improvements. We would like to highlight some of the cross-layer aspects that need to be considered to bring further improvements to the standardized WSN protocol stack for the IoT.
Evolution and Disruption in Network Processing for the Internet of Things: The Internet of Things (Ubiquity symposium)
by Lorenzo Di Gregorio
Between prophecies of revolutions and the inertia of legacies, the Internet of Things (IoT) has already become the brand under which light processing units communicate over complex networks. Network processing is caught between demands for computation, raised by the growing complexity of the networks, and limitations imposed on processing by the performance of lightweight devices. This contribution discusses the potential for disruptive changes against the scaling of existing technologies, specifically three main aspects of the IoT that impact network protocols and their processing: the reversal of client/server architectures, the scavenging of spectral bands, and the federation of Internet gateways.
Fog Computing Distributing Data and Intelligence for Resiliency and Scale Necessary for IoT: The Internet of Things (Ubiquity symposium)
by Charles C. Byers, Patrick Wetterwald
The Internet of Everything (IoE) is more than a $19 trillion opportunity over 10 years. Fifty billion devices will be connected to various networks by 2020. This brings new technical challenges in all domains, and specifically in data processing. Distributed intelligence is one of the key technological answers. We call it "fog computing." Fog can provide intelligent connection of people, processes, data, and things in hierarchical Internet of Things networks. By supplementing the cloud and providing intermediate layers of computation, networking, and storage, fog nodes can optimize IoE deployments---greatly enhancing latency, bandwidth, reliability, security, and overall IoE network performance. The article will analyze the architecture and main design choices of this technology.
Internet Programmable IoT: On the role of APIs in IoT: The Internet of Things (Ubiquity symposium)
by Maja Vukovic
With everything and everyone accessible as a virtual resource on the web, novel applications created out of existing capabilities will continue to emerge. This article discusses the role of, and challenges for, APIs in transforming IoT capabilities.
A Case for Interoperable IoT Sensor Data and Meta-data Formats: The Internet of Things (Ubiquity symposium)
by Milan Milenkovic
While much attention has been focused on building sensing systems and the backing cloud infrastructure in the Internet of Things/Web of Things (IoT/WoT) community, enabling third-party applications and services that can operate across domains and across devices has not been given much consideration. The challenge for the community is to devise standards and practices that enable integration of data from sensors across devices, users, and domains, allowing new types of applications and services that facilitate a much more comprehensive understanding of, and quantitative insights into, the world around us.
Standards for Tomorrow: The Internet of Things (Ubiquity symposium)
by Dejan Milojicic, Paul Nikolich, Barry Leiba
Over the decades, standards have been critical for defining how to interconnect computer and networking devices across different vendors so they can seamlessly work together. Standards have been critical not only in networking and computer interfaces, but also at the operating system and systems software level. There are many examples, such as IEEE 802, POSIX, IETF, and W3C. There was always the question of the right time to standardize (not too early and not too late), and the time to complete a standardization project always seemed too long, yet unavoidable. However, the contemporary industry seems to be more dynamic and fast-evolving than it has ever been, demanding more agile processes. Open source processes and software-defined technologies (networks, storage, data centers, etc.) offer alternatives to standards. In this article we attempt to envision the future role of standards, and how they will complement and enhance alternative choices toward the same goal. We first summarize traditional standards, then discuss alternatives and a couple of use cases, and conclude with some future directions and opportunities for standardization.
W3C Plans for Developing Standards for Open Markets of Services for the IoT: The Internet of Things (Ubiquity symposium)
by Dave Raggett
October 2015
The Internet of Things (IoT) is being held back by divergent approaches that result in data silos, high costs, investment risks and reduced market opportunities. To realize the potential and unleash the network effect, W3C is focusing on the role of Web technologies for a platform of platforms as a basis for services spanning IoT platforms from microcontrollers to cloud-based server farms. Shared semantics are essential for discovery, interoperability, scaling and layering on top of existing protocols and platforms. For this purpose, metadata can be classified into: things, security, and communications, where things are considered to be virtual representations (software objects) for physical or abstract entities. Thing descriptions are modeled in terms of W3C's resource description framework (RDF). This includes the semantics for what kind of thing it is, and the data models for its events, properties and actions. The underlying protocols are free to use whichever communication patterns are appropriate to the context according to the constraints described by the given metadata. W3C is exploring the use of lightweight representations of metadata that are easy to author and process, even on resource constrained devices. The aim is to evolve the web from a web of pages to a "Web of Things."
Discovery in the Internet of Things: The Internet of Things (Ubiquity symposium)
by Arkady Zaslavsky, Prem Prakash Jayaraman
October 2015
How do you find a "thing" in the Internet of Things (IoT) haystack? The answer to this question will be the key challenge that IoT users and developers face now and in the future. Current models for IoT are focused heavily on developing vertical solutions limited by hardware and software platforms and support. With the explosion of IoT predicted in the coming years by Cisco, IBM, and Gartner, there is a need to rethink how IoT can deliver value to the end user. A paradigm shift is required in the underlying fundamentals of current IoT developments to enable a wider notion of "thing" discovery, as well as discovery of relevant data and context on the IoT. Discovery will allow users to build IoT apps, services, and applications using "smart things" without the need for a priori knowledge of things. In this article, we look at the current state of IoT and argue for a paradigm shift, addressing why and how discovery can make a significant impact on the future of IoT and, moreover, become a necessary component of the IoT success story.
The Third Wave: The internet of things: The Internet of Things (Ubiquity symposium)
by Kemal A. Delic
October 2015
These days we are witnessing the rise of always-on, connected devices, systems, and environments. This might well represent the basis for the emerging digital economy. This symposium brings together diverse points of view in an attempt to synthesize the most likely future of the Internet of Things (IoT).
On Quantum Computing: An interview with David Penkler
by Kemal A. Delic
September 2015
In recent months, announcements of progress toward harnessing quantum computing have solicited diverse and sometimes strong reactions and opinions from academia and industry. Some say quantum computing is impossible, while others point to actual machines, raising the question of whether they really are quantum computers. In this interview, Dave Penkler---an HP fellow whose primary interests are in cloud and data-center scale operating systems and networks---shares his view on the present and future of quantum computing. Penkler has 40 years of experience with computer hardware and software and has always had a keen interest in their evolution as enabled by advances in science and technology.
Automated bug fixing: an interview with Westley Weimer, Department of Computer Science, University of Virginia and Martin Monperrus, University of Lille and INRIA, Lille, France
by Walter Tichy
Fixing bugs manually is expensive, time-consuming, and unpleasant. How about getting the computer to fix the bugs, automatically? Automatic repair might save us from misunderstandings, lack of time, carelessness, or plain old laziness, but it also raises questions about fundamental limitations. Yet in the past 10 years, a number of young scientists have taken on automatic bug fixing. This interview discusses the approximations currently in use and how far they can take us.