  • Professional writing tips and techniques

    Each "Communication Corner" essay is self-contained; however, they build on each other. For best results, before reading this essay and doing the exercise, go to the first essay "How an Ugly Duckling Became a Swan," then read each succeeding essay.

    Good writing is not easy, but these 12 tips and techniques make things easier.

  • How to do a naked presentation

    Each "Communication Corner" essay is self-contained; however, they build on each other. For best results, before reading this essay and doing the exercise, go to the first essay "How an Ugly Duckling Became a Swan," then read each succeeding essay.

    A "naked" presentation solely relies upon good storytelling. Learn how to enrapture an audience without the use of visual aids.

  • Students tackle the routing problem for in-traffic emissions tests

    Vehicle emissions tests used to be done entirely in the laboratory. However, certain car manufacturers cheated on those tests. In response, the European Union introduced emissions tests in real traffic. To make such tests meaningful, they must be performed on routes that meet certain criteria, such as the difference in elevation between start and end points and the proportion of urban and country roads. Finding suitable routes is a complex search problem. Undergraduate students from Karlsruhe Institute of Technology, Germany, developed the first fully automatic solution for finding such routes. In this interview, they share how they did it.
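
    The interview does not spell out the students' algorithm. As a rough sketch of the kind of constraint check such a route search must perform (the route fields and thresholds below are illustrative assumptions, not the students' actual criteria or data model):

      # Minimal sketch of the constraint-checking step in such a route search.
      # Route fields and thresholds are illustrative, not the real RDE criteria.
      from dataclasses import dataclass

      @dataclass
      class Route:
          start_elevation_m: float
          end_elevation_m: float
          urban_km: float
          rural_km: float

      def is_valid_test_route(r: Route,
                              max_elevation_diff_m: float = 100.0,
                              min_urban_share: float = 0.29,
                              max_urban_share: float = 0.44) -> bool:
          """Check a candidate route against elevation and road-mix criteria."""
          total_km = r.urban_km + r.rural_km
          if total_km == 0:
              return False
          urban_share = r.urban_km / total_km
          elevation_ok = abs(r.end_elevation_m - r.start_elevation_m) <= max_elevation_diff_m
          return elevation_ok and min_urban_share <= urban_share <= max_urban_share

      # A full solution searches a road network for routes passing this check.
      candidates = [Route(120, 150, 31, 40), Route(200, 340, 10, 80)]
      print([is_valid_test_route(r) for r in candidates])  # [True, False]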

  • A conversation with Marianna Obrist: using touch, taste and smell in virtual and augmented experiences

    In this series of interviews with innovation leaders, Ubiquity Associate Editor and software engineer, Dr. Bushra Anjum sits down with Marianna Obrist, who is exploring augmented and virtual reality within the context of HCI. Obrist discusses multi-sensory interactions that go beyond sight and sound, as well as her work that explores the role of human senses in the design of future technologies.

  • If you write it better, you will say it better

    Each "Communication Corner" essay is self-contained; however, they build on each other. For best results, before reading this essay and doing the exercise, go to the first essay "How an Ugly Duckling Became a Swan," then read each succeeding essay.

    Preparing a good text for reading and preparing a good text for speaking are often considered to be unrelated activities. This is incorrect. A good text for reading and a good text for speaking are distinct, but they are not alien. They are complementary.

  • How to avoid death by powerpoint: Steve Jobs' secret weapon

    Each "Communication Corner" essay is self-contained; however, they build on each other. For best results, before reading this essay and doing the exercise, go to the first essay "How an Ugly Duckling Became a Swan," then read each succeeding essay.

    Bite the bullet and learn how to organize your presentation slides to get the greatest effect.

  • Cybersecurity is not very important

    There is a rising tide of security breaches. There is an even faster rising tide of hysteria over the ostensible reason for these breaches, namely the deficient state of our information infrastructure. Yet the world is doing remarkably well overall, and has not suffered any of the oft-threatened giant digital catastrophes. This continuing general progress of society suggests that cyber security is not very important. Adaptations to cyberspace of techniques that worked to protect the traditional physical world have been the main means of mitigating the problems that occurred. This "chewing gum and baling wire" approach is likely to continue to be the basic method of handling problems that arise, and to provide adequate levels of security.

  • Computing a landing spot on Mars: an interview with Victor Pankratius

    The purpose of the Mars rover is in its name---to rove, explore, study Martian geology, look for signs of water and of life (past or present), and so on. However, achieving these and other objectives requires putting the rover down at a suitable landing site, i.e., one that is promising for the desired investigations and safe enough to land on and operate from without breaking down.

    The data for making these decisions comes from prior Mars missions. Selecting a suitable landing site is a complex process that typically takes several years. Researchers at MIT's Kavli Institute for Astrophysics and Space Research prototyped new software that can help NASA mission planners find landing sites more rapidly and reliably, potentially reducing the total time required to weeks. In this interview, Victor Pankratius, leader of the research team, shares some insight into the project.

  • An interview with Jason Ernst: incentives of a decentralized networking infrastructure

    In this series of interviews with innovation leaders, Ubiquity Associate Editor and software engineer, Dr. Bushra Anjum, sits down with Jason Ernst, CTO of RightMesh, to discuss how his company is using mobile mesh networks to decentralize existing network infrastructure in areas where it doesn't exist or is too expensive to maintain--effectively putting the control of data in the hands of the people.

  • Why visual aids need to be less visual

    Each "Communication Corner" essay is self-contained; however, they build on each other. For best results, before reading this essay and doing the exercise, go to the first essay "How an Ugly Duckling Became a Swan," then read each succeeding essay.

    Public speaking is not only about communicating your ideas orally, but also visually. Too many presentations are undermined by poorly chosen slides. An outstanding presentation is one that addresses two fundamental objectives, with the end goal of leaving a lasting impression on the audience.

  • How to instantaneously improve your speaking voice

    Each "Communication Corner" essay is self-contained; however, they build on each other. For best results, before reading this essay and doing the exercise, go to the first essay "How an Ugly Duckling Became a Swan," then read each succeeding essay.

    Although we spend much more time speaking than we do writing, the fact remains that most people speak very poorly. Phil Yaffe provides some tips on how to purposely redesign your articulation.

  • An interview with Indrajit Roy: toward self-correcting systems

    Indrajit Roy is a staff engineer at Google. He is currently working on peta-scale distributed databases. Previously, he was a principal researcher at HP Labs where he led the development of Distributed R, an open source HP product that brings the benefits of parallelism to data scientists. Roy received his Ph.D. in computer science from UT Austin. He is also an inaugural member of the ACM Future of Computing Academy.

  • Silence is golden, especially when you need to say something important

    Each "Communication Corner" essay is self-contained; however, they build on each other. For best results, before reading this essay and doing the exercise, go to the first essay "How an Ugly Duckling Became a Swan," then read each succeeding essay.

    How well you speak will always be an indicator of how well you know the subject at hand. And while nerves can often lead novice speakers to resort to distracting sounds and placeholders, a second or two of silence will help focus you as well as your audience. In this installment, Philip Yaffe reminds us that silence is golden.

  • An interview with Lauren Maffeo: understanding the risks of machine learning bias

    Lauren Maffeo is a research analyst who joined the global technology sector in 2012. She started her career as a freelance journalist covering tech news for The Next Web and The Guardian. She has also worked with CEOs of pre-seed to profitable SaaS startups on media strategy. Lauren joined GetApp, a Gartner company, as a content editor in 2016. She covers the impact of emerging tech like AI on small and midsize business owners.

    Lauren has been cited by sources including Forbes, Fox Business, DevOps Digest, The Atlantic, and Inc.com. In 2017, Lauren was named to The Drum's 50 Under 30 list of women worth watching in digital. She holds an M.Sc. from The London School of Economics and a certificate in Artificial Intelligence: Implications for Business Strategy from MIT's Sloan School of Management.

  • An interview with Pamela Wisniewski: making the online world safer for our youth

    Dr. Pamela Wisniewski is an assistant professor at the University of Central Florida's Department of Computer Science and an inaugural member of the ACM Future Computing Academy. As a human-computer interaction researcher, she studies privacy as a means to protect people, but more importantly, as a social mechanism to enrich online interactions that people share with others. She is particularly interested in the interplay between social media, privacy, and online safety for adolescents. Being a survivor of childhood sexual abuse, she is committed to protecting at-risk youth from online sexual predation risks, as well as empowering vulnerable youth online, so that they can garner the resources and support they need to overcome adversity and succeed in life.

  • How to untie your tongue

    Each "Communication Corner" essay is self-contained; however, they build on each other. For best results, before reading this essay and doing the exercise, go to the first essay "How an Ugly Duckling Became a Swan," then read each succeeding essay.

    Communication is not only the written word. In this installment, Philip Yaffe shares tips and exercises that will help improve your skills in both writing and speaking.

  • An interview with Bushra Anjum: learning to be a generalist is valuable to your career

    Dr. Bushra Anjum is a senior editor for ACM's web-based magazine Ubiquity. Her research background is in performance evaluation and queuing theory. She is also a trained data scientist, having worked extensively with predictive analytics. Anjum, a Fulbright Scholar, has previously held academic positions in the U.S. and Pakistan, and is a keen enthusiast of promoting diversity in the STEM fields. She is a mentor at Rewriting the Code, GlobalTechWomen, ReigningIt, Empowering Leadership Alliance, LeanIn.org, Computing Beyond the Double Bind's mentoring network, and others. Dr. Anjum can be contacted via Twitter @DrBushraAnjum.

  • The 7% rule revisited

    Each "Communication Corner" essay is self-contained; however, they build on each other. For best results, before reading this essay and doing the exercise, go to the first essay "How an Ugly Duckling Became a Swan," then read each succeeding essay.

    In this installment, Philip Yaffe debunks the myth of verbal versus non-verbal communication.

  • Banishing the fear of public speaking

    Each "Communication Corner" essay is self-contained; however, they build on each other. For best results, before reading this essay and doing the exercise, go to the first essay "How an Ugly Duckling Became a Swan," then read each succeeding essay.

    In this installment, Philip Yaffe explores how to speak to a crowd.

  • Big data: big data or big brother? That is the question now.

    This ACM Ubiquity Symposium presented some of the current thinking about big data developments across four topical dimensions: social, technological, application, and educational. While 10 articles can hardly touch the expanse of the field, we have sought to cover the most important issues and provide useful insights for the curious reader. More than two dozen authors from academia and industry shared their points of view, their current focus of interest, and their outlines of future research. Big digital data has changed and will change the world in many ways. It will bring big benefits in the future but, combined with big AI and big IoT devices, it also creates several big challenges. These must be carefully addressed and properly resolved for the future benefit of humanity.

  • Artificial intelligence in politics: an interview with Sven Körner and Mathias Landhäußer of thingsTHINKING

    Natural language processing, an area of artificial intelligence (AI), has attained remarkable successes. Digital assistants such as Siri and Alexa respond to spoken commands, and understand several languages. Google has demonstrated a machine can call up a restaurant and make a reservation in a manner that is indistinguishable from a human. Automated translation services are used around the world in over a hundred languages. This interview discusses a new and surprising application of language processing in politics. Though the AI software analyzes texts in German, it could be adapted to any language. The underlying technology has wider applications in text analysis, including legal tech, contracting, and others. Here is a summary.

  • Big Data: Business, Technology, Education, and Science: Big Data (Ubiquity symposium)

    Transforming the latent value of big data into real value requires the intelligence and applied skill of human data scientists. Data scientists are expected to have a wide range of technical skills alongside being passionate, self-directed people who are able to work easily with others and deliver high-quality outputs under pressure. There are hundreds of university, commercial, and online courses in data science and related topics. Apart from people with breadth and depth of knowledge and experience in data science, we identify a new educational path to train "bridge persons," who combine knowledge of an organization's business with sufficient knowledge and understanding of data science to "bridge" between non-technical people in the business and highly skilled data scientists who add value to the business. The increasing proliferation of big data and the great advances made in data science do not herald an era in which all problems can be solved by deep learning and artificial intelligence. Although data science opens up many commercial and social opportunities, it must complement other science in the search for new theory and methods to understand and manage our complex world.

  • Corporate Security is a Big Data Problem: Big Data (Ubiquity symposium)

    In modern times, we have seen a major shift toward hybrid cloud architectures, where corporations operate in a large, highly extended ecosystem. The traditional enterprise security perimeter is thus disappearing, evolving into the concept of security intelligence, where the volume, velocity, and variety of data have dramatically changed. Today, to cope with the fast-changing security landscape, we must be able to transform huge data lakes, via security analytics and big data technologies, into effective security intelligence presented through a security "cockpit," in order to achieve a better corporate security and compliance level and to support sound risk management and informed decision making. We present a high-level architecture for efficient security intelligence and the concept of a security cockpit as a point of control for the corporate security and compliance state. We conclude that corporate security can nowadays be perceived as a big data problem.

  • How to make dull information exciting

    Each "Communication Corner" essay is self-contained; however, they build on each other. For best results, before reading this essay and doing the exercise, go to the first essay "How an Ugly Duckling Became a Swan," then read each succeeding essay.

    In this installment, Philip Yaffe explains how to give the reader what they want.

  • P2P networks are inherently unstable

    The P2P technology underlying file-sharing systems like Gnutella and distributed autonomous organizations like blockchain is inherently unstable because of self-organizing processes akin to Gause's competitive exclusion principle and preferential attachment. To maintain an egalitarian P2P organization, it is necessary to conserve the original network's entropy, defined as the random structure of the network and of actions among peers.
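
    The abstract gives no formulas; one toy way to make "network entropy" concrete is the Shannon entropy of the degree distribution, which falls as preferential attachment concentrates links on a few hubs. A minimal sketch (the networks and the measure are illustrative assumptions; the article's formal definition may differ):

      # Toy sketch: Shannon entropy of a network's degree distribution, one way
      # to quantify how "egalitarian" a P2P topology is. As preferential
      # attachment concentrates links on hubs, this entropy falls.
      from collections import Counter
      from math import log2

      def degree_entropy(edges):
          degrees = Counter()
          for u, v in edges:
              degrees[u] += 1
              degrees[v] += 1
          n = sum(degrees.values())
          probs = [d / n for d in degrees.values()]
          return -sum(p * log2(p) for p in probs)

      ring = [(i, (i + 1) % 6) for i in range(6)]        # egalitarian: all degree 2
      star = [(0, i) for i in range(1, 6)]               # hub-dominated
      print(degree_entropy(ring) > degree_entropy(star)) # True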

  • When Good Machine Learning Leads to Bad Security: Big Data (Ubiquity symposium)

    While machine learning has proven promising in several application domains, our understanding of its behavior and limitations is still in its nascent stages. One such domain is cybersecurity, where machine learning models are replacing traditional rule-based systems, owing to their ability to generalize and to deal with large-scale attacks not seen before. However, the naive transfer of machine learning principles to the domain of security must be treated with caution. Machine learning was not designed with security in mind and as such is prone to adversarial manipulation and reverse engineering. While most data-based learning models rely on a static assumption about the world, the security landscape is especially dynamic, with a never-ending arms race between the system designer and the attackers. Any solution designed for such a domain needs to take into account an active adversary and to evolve over time in the face of emerging threats. We term this the "dynamic adversarial mining" problem, and this paper provides motivation and foundation for this new interdisciplinary area of research, at the crossroads of machine learning, cybersecurity, and streaming data mining.
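
    A minimal illustration of the adversarial-manipulation problem the authors describe: a small gradient-sign perturbation, in the style of the well-known fast gradient sign method, flips a linear classifier's decision. The model, weights, and data are toy assumptions:

      # Toy illustration of adversarial manipulation: a small gradient-sign
      # perturbation flips a linear classifier's decision.
      import numpy as np

      w = np.array([0.8, -0.5, 0.3])   # weights of a "trained" linear classifier
      b = -0.1
      x = np.array([0.4, 0.2, 0.1])    # a benign sample, classified as positive

      def predict(x):
          return 1 if w @ x + b > 0 else 0

      eps = 0.3
      x_adv = x - eps * np.sign(w)     # push each feature against the weights

      print(predict(x))      # 1 -> classified benign
      print(predict(x_adv))  # 0 -> same point, slightly perturbed, misclassified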

  • First write like you speak, then write like you write

    Each "Communication Corner" essay is self-contained; however, they build on each other. For best results, before reading this essay and doing the exercise, go to the first essay "How an Ugly Duckling Became a Swan," then read each succeeding essay.

    In this installment, Philip Yaffe introduces a two-step plan to create well-written text that will not only impress the reader, but also engage the reader to digest and comprehend new ideas or concepts with ease.

  • Developing an Open Source 'Big Data' Cognitive Computing Platform: Big Data (Ubiquity symposium)

    The ability to leverage diverse data types requires a robust and dynamic approach to systems design. The needs of a data scientist are as varied as the questions being explored. Compute systems have focused on the management and analysis of structured data as the driving force of analytics in business. As open source platforms have evolved, the ability to apply compute to unstructured information has exposed an array of platforms and tools available to the business and technical community. We have developed a platform that meets the needs of the analytics user requirements of both structured and unstructured data. This analytics workbench is based on acquisition, transformation, and analysis using open source tools such as Nutch, Tika, Elastic, Python, PostgreSQL, and Django to implement a cognitive compute environment that can handle widely diverse data, and can leverage the ever-expanding capabilities of infrastructure in order to provide intelligence augmentation.

  • High Performance Synthetic Information Environments: An integrating architecture in the age of pervasive data and computing: Big Data (Ubiquity symposium)

    The complexities of social and technological policy domains, such as the economy, the environment, and public health, present challenges that require a new approach to modeling and decision-making. The information required for effective policy and decision making in these complex domains is massive in scale, fine-grained in resolution, and distributed over many data sources. Thus, one of the key challenges in building systems to support policy informatics is information integration. Synthetic information environments (SIEs) present a methodological and technological solution that goes beyond the traditional approaches of systems theory, agent-based simulation, and model federation. An SIE is a multi-theory, multi-actor, multi-perspective system that supports continual data uptake, state assessment, decision analysis, and action assignment based on large-scale high-performance computing infrastructures. An SIE allows rapid course-of-action analysis to bound variances in outcomes of policy interventions, which in turn allows the short time-scale planning required in response to emergencies such as epidemic outbreaks.

  • Technology and Business Challenges of Big Data in the Digital Economy: Big Data (Ubiquity symposium)

    The early digital economy, during the dot-com days of internet commerce, successfully faced its first big data challenge, click-stream analysis, with map-reduce technology. Since then the digital economy has become much more pervasive. As the digital economy evolves, looking to benefit from its burgeoning big data assets, an important technical-business challenge is emerging: how to acquire, store, access, and exploit the data at a cost lower than the incremental revenue or GDP that its exploitation generates. This challenge is all the sharper now that the efficiency increases, which lasted for 50 years thanks to improvements in semiconductor manufacturing, are slowing and coming to an end.

  • Big Data for Social Science Research: Big Data (Ubiquity symposium)

    Academic studies exploiting novel data sources are scarce. Typically, data is generated by commercial businesses or government organizations with no mandate and little motivation to share their assets with academic partners---partial exceptions include social messaging data and some sources of open data. The mobilization of citizen sensors at a massive scale has allowed for the development of impressive infrastructures. However, data availability is driving applications---problems are prioritized because data is available rather than because they are inherently important or interesting. The U.K. is addressing this through investments by the Economic and Social Research Council in its Big Data Network. A group of Administrative Data Research Centres are tasked with improving access to data sets in central government, while a group of Business and Local Government Centres are tasked with improving access to commercial and regional sources. This initiative is described. It is illustrated by examples from health care, transport, and infrastructure. In all of these cases, the integration of data is a key consideration. For social science problems relevant to policy or academic studies, it is unlikely all the answers will be found in a single novel data source, but rather a combination of sources is required. Through such synthesis great leaps are possible by exploiting models that have been constructed and refined over extended periods of time, e.g., microsimulation, spatial interaction models, agents, discrete choice, and input-output models. Although interesting and valuable new methods are appearing, any suggestion that a new box of magic tricks labeled "Big Data Analytics" that sits easily on top of massive new datasets can radically and instantly transform our long-term understanding of society is naïve and dangerous. Furthermore, the privacy and confidentiality of personal data is a great concern to both the individuals concerned and the data owners.

  • Big Data and the Attention Economy: Big Data (Ubiquity symposium)

    While attention has always been prized above money, few people have had the means to attract it to themselves. But the new digital economy has provided everyone with a loudspeaker; thus efforts at getting noticed have rapidly escalated in global society. The attention economy focuses on the mechanisms that mediate the allocation of this scarce entity. Social networks and big data play a role in determining what is noticed and acted upon.

  • Big Data, Digitization, and Social Change: Big Data (Ubiquity symposium)

    We use the term "big data" with the understanding that the real game changer is the connection and digitization of everything. Every portfolio is affected: finance, transport, housing, food, environment, industry, health, welfare, defense, education, science, and more. The authors in this symposium will focus on a few of these areas to exemplify the main ideas and issues.

  • How to say what you mean and mean what you say

    Each "Communication Corner" essay is self-contained; this week learn how to construct truly effective sentences. For best results, before reading this essay and doing the exercise, go to the first essay "How an Ugly Duckling Became a Swan," then read each succeeding essay.

  • Art Scott and Michael Frank on energy-efficient computing

    Clock speeds of computing chips have leveled off dramatically since 2005, and putting more cores in systems on a chip (SoC) has produced more heat, adding a new ceiling to further advances. Leading-edge researchers, like Mike Frank, and dedicated technologists with a wealth of experience, like Art Scott, represent a new vanguard of the leap-forward beyond Dennard scaling and Landauer's limit. Art looks for ways to reduce energy consumption and Mike looks for ways to "architect" future chips according to principles of reversibility. Is the future in reversible, adiabatic computing and simpler architectures using posit arithmetic? My guests think so.

  • Why writing short sentences may be short-changing your reader

    Each "Communication Corner" essay is self-contained; however, they build on each other. For best results, before reading this essay and doing the exercise, go to the first essay "How an Ugly Duckling Became a Swan," then read each succeeding essay.

  • Mixing computation with people: an interview with Marianne Winslett

    In this interview, we learn about five fascinating subjects: security in manufacturing, negotiating trust in the web, updating logical databases, differential privacy, and scientific computing (including its security issues). This is a confluence that has, at its roots, the thorny problems that arise when you mix computation with people. Some beautiful technical results, many originated by Marianne Winslett, now address those challenges, but some surprises crop up along the way.

  • How to improve your writing by standing on your head

    Newspapers provide the best examples of clear, concise writing you can find anywhere. Learning how journalists work their magic can help you write better, and it all begins with the "inverted pyramid."

  • Cybersecurity skeptics now embracing formal methods: an interview with Gernot Heiser and Jim Morris

    There is new hope for those who despair securing computer systems from external hackers. The recent DARPA HACMS project demonstrated conclusively that "certain pathways for attackers have all been shut down in a way that's mathematically proven to be unhackable for those pathways." Continuing research at DARPA and IARPA will eventually shut down all the pathways, and the external hackers will be out of business permanently.

  • The three acid tests of persuasive writing

    Each Communication Corner essay is self-contained; however, they build on each other. For best results, if you have not already done so, before reading this essay and doing the exercise, go to the first essay "How an Ugly Duckling Became a Swan," then read each succeeding essay sequentially.

  • How an Ugly Duckling Became a Swan

    The thrust of the Communication Corner is to offer step-by-step advice to help you become a better writer and speaker. This first essay explains how Philip Yaffe went from being a very poor writer and speaker to a recognizably good one, almost despite himself.

  • Unums 2.0: An Interview with John L. Gustafson

    In an earlier interview (April 2016), Ubiquity spoke with John Gustafson about the unum, a new format for floating point numbers. The unique property of unums is that they always know how many digits of accuracy they have. Now Gustafson has come up with yet another format that, like the unum 1.0, always knows how accurate it is. But it also allows an almost arbitrary mapping of bit patterns to the reals. In doing so, it paves the way for custom number systems that squeeze the maximum accuracy out of a given number of bits. This new format could have prime applications in deep learning, big data, and exascale computing.
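
    For flavor, here is a minimal sketch of the general idea of a custom bit-to-real mapping: a short code is simply an index into a chosen lattice of reals, and arithmetic rounds results back onto the lattice. The lattice and rounding rule are illustrative assumptions, not Gustafson's actual unum 2.0 design, which also records whether a result is exact or falls between lattice points:

      # Sketch of a custom number system: a 3-bit code indexes a chosen lattice
      # of reals, and arithmetic rounds back onto the lattice. The lattice here
      # is an arbitrary illustration, not Gustafson's actual mapping.
      LATTICE = [-8.0, -1.0, -0.25, 0.0, 0.25, 1.0, 8.0]

      def decode(code: int) -> float:
          return LATTICE[code]

      def encode(x: float) -> int:
          # round to the nearest representable value
          return min(range(len(LATTICE)), key=lambda i: abs(LATTICE[i] - x))

      def add(a: int, b: int) -> int:
          return encode(decode(a) + decode(b))

      one = encode(1.0)
      print(decode(add(one, one)))   # 1.0 + 1.0 rounds to the nearest lattice point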

  • Rethinking Randomness: An interview with Jeff Buzen, Part II

    In Part 1, Jeff Buzen discussed the basic principles of his new approach to randomness, which is the topic of his book Rethinking Randomness. He continues here with a more detailed discussion of models that have been used successfully to predict the performance of systems ranging from early time sharing computers to modern web servers.

    Peter J. Denning
    Editor in Chief

  • Rethinking Randomness: An interview with Jeff Buzen, Part I

    For more than 40 years, Jeffrey Buzen has been a leader in performance prediction of computer systems and networks. His first major contribution was an algorithm, known now as Buzen's Algorithm, that calculated the throughput and response time of any practical network of servers in a few seconds. Prior algorithms were useless because they would have taken months or years for the same calculations. Buzen's breakthrough opened a new industry of companies providing performance evaluation services, and laid scientific foundations for designing systems that meet performance objectives. Along the way, he became troubled by the fact that the real systems he was evaluating seriously violated his model's assumptions, and yet the faulty models predicted throughput to within 5 percent of the true value and response time to within 25 percent. He began puzzling over this anomaly and invented a new framework for building computer performance models, which he called operational analysis. Operational analysis produced the same formulas, but with assumptions that hold in most systems. As he continued to understand this puzzle, he formulated a more complete theory of randomness, which he calls observational stochastics, and he wrote a book Rethinking Randomness laying out his new theory. We talked with Jeff Buzen about his work.

    Peter J. Denning
    Editor in Chief
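
    The interview does not reproduce the algorithm, but it is famously compact: a single convolution pass computes the normalizing constant G(n), from which throughput and utilizations follow. A sketch for load-independent stations (the service demands below are illustrative):

      # Buzen's convolution algorithm for a closed product-form queueing
      # network: computes the normalizing constant G(n) in O(M*N) time.
      def buzen(demands, N):
          """demands[i]: service demand of station i; N: number of customers."""
          g = [1.0] + [0.0] * N            # g[n] = G(n) for the stations seen so far
          for D in demands:
              for n in range(1, N + 1):
                  g[n] += D * g[n - 1]     # G_m(n) = G_{m-1}(n) + D_m * G_m(n-1)
          return g

      demands = [0.4, 0.3, 0.2]            # seconds of service per job at each station
      N = 5
      g = buzen(demands, N)
      throughput = g[N - 1] / g[N]         # system throughput with N customers
      utilizations = [D * throughput for D in demands]
      print(round(throughput, 3), [round(u, 2) for u in utilizations])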

  • Changing the Game: Dr. Dave Schrader on sports analytics

    Dave Schrader, known to his friends as Dr. Dave, worked for 24 years in advanced development and marketing at Teradata, a major data warehouse vendor. He actively gives talks on business analytics, and since retiring has spent time exploring the field of sports analytics. In this interview, Schrader discusses how analytics is playing a significant role in professional sports--from Major League Soccer to the NBA.

  • The Future of Technology and Jobs: An interview with Dr. R.A. Mashelkar

    The following interview with Dr. Raghunath Anant Mashelkar considers how technology will change the face of employment in the future. What will the jobs of the future look like? What skills are needed to prepare students and researchers for employment in the digital age? As our world becomes more digitized by the day, technology influences how students communicate, learn, work, and interact with society more than it did for any generation before. Ultimately, today's students have to compete in a more globalized, mobile workforce amid rapid technological advancement.

  • The End of (Numeric) Error: An interview with John L. Gustafson

    Crunching numbers was the prime task of early computers. The common element of these early computers is they all used integer arithmetic. John Gustafson, one of the foremost experts in scientific computing, has proposed a new number format that provides more accurate answers than standard floats, yet saves space and energy. The new format might well revolutionize the way we do numerical calculations.

  • The Rise of Computational Biology: An interview with Prof. Thomas Lengauer

    In this wide-ranging interview, we will hear from a pioneer in computational biology on where the field stands and on where it is going. The topics stretch from gene sequencing and protein structure prediction, all the way to personalized medicine and cell regulation. We'll find out how bioinformatics uses a data-driven approach and why personal drugs may become affordable. We'll even discuss whether we will be able to download our brains into computers and live forever.

  • Internet of Things in Energy Efficiency: The Internet of Things (Ubiquity symposium)

    This paper explains what the Internet of Things (IoT) means for energy-efficiency applications, surveys its technical and business impacts, and weighs its opportunities and risks for the different market players. It concludes with the author's long-term vision for the use of IoT in energy-efficiency applications.

  • On Resilience of IoT Systems: The Internet of Things (Ubiquity symposium)

    At a very high level of abstraction, the Internet of Things (IoT) can be modeled as a hyper-scale, hyper-complex cyber-physical system. Studying the resilience of IoT systems is the first step toward engineering the future IoT ecosystems. Exploration of this domain is a highly promising avenue for many aspiring Ph.D. and M.Sc. students.

  • Ensuring Trust and Security in the Industrial IoT: The Internet of Things (Ubiquity symposium)

    The industrial Internet of Things (IoT) is a distributed network of smart sensors that enables precise control and monitoring of complex processes over arbitrary distances. The great advantage of the industrial IoT is counterbalanced by a security weakness: the insertion of a smart device capable of extracting protected data or performing malicious actions can infect the whole network with relative ease. It thus becomes imperative to discover whether new devices have the right capabilities and compatibilities with other sensors. This article presents a zero-knowledge protocol that achieves precisely that objective while keeping the sensor data private.
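
    The article presents its own protocol; purely for flavor, here is a toy Schnorr-style zero-knowledge identification round, a generic textbook scheme in which a prover demonstrates knowledge of a secret key without revealing it. The parameters are illustratively tiny and unusable in practice:

      # Toy Schnorr-style zero-knowledge identification: the prover convinces
      # the verifier it knows the secret x behind y = g^x mod p without
      # revealing x. Generic textbook protocol, not the article's; the
      # parameters are illustratively tiny (never use such sizes in practice).
      import secrets

      p, q, g = 23, 11, 4          # g generates a subgroup of prime order q mod p

      x = secrets.randbelow(q - 1) + 1    # prover's secret key
      y = pow(g, x, p)                    # prover's public key

      # one round of the protocol
      r = secrets.randbelow(q)            # prover's ephemeral nonce
      t = pow(g, r, p)                    # commitment, sent to verifier
      c = secrets.randbelow(q)            # verifier's random challenge
      s = (r + c * x) % q                 # prover's response

      assert pow(g, s, p) == (t * pow(y, c, p)) % p   # verifier's check
      print("identity verified without revealing x")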

  • Using Redundancy to Detect Security Anomalies: Towards IoT security attack detectors: The Internet of Things (Ubiquity symposium)

    Cyber-attacks and breaches are often detected too late to avoid damage. While "classical" reactive cyber defenses usually work only if we have some prior knowledge about the attack methods and "allowable" patterns, properly constructed redundancy-based anomaly detectors can be more robust and often able to detect even zero-day attacks. They are a step toward an oracle that uses the knowable behavior of a healthy system to identify abnormalities. In the world of the Internet of Things (IoT), security, and handling the anomalous behavior of sensors and other IoT components, will be orders of magnitude more difficult unless we make those elements security-aware from the start. In this article we examine the ability of redundancy-based anomaly detectors to recognize some high-risk and difficult-to-detect attacks on web servers---a likely management interface for many stand-alone IoT elements. In real life it has taken a long time, a number of years in some cases, to identify some of these vulnerabilities and related attacks. We discuss the practical relevance of the approach in the context of providing high-assurance web services that may belong to autonomous IoT applications and devices.
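
    A minimal sketch of the redundancy idea: serve the same request from diverse replicas and treat disagreement as an anomaly, with no prior signature of the attack required. The replica responses below are illustrative stand-ins:

      # Minimal sketch of redundancy-based anomaly detection: the same request
      # is served by diverse replicas; a response that deviates from the
      # majority is flagged, with no prior attack signature needed.
      from collections import Counter

      def detect_anomaly(responses):
          """responses: {replica_name: response}. Returns replicas that
          disagree with the majority answer, or an empty list if all agree."""
          majority, _ = Counter(responses.values()).most_common(1)[0]
          return [name for name, resp in responses.items() if resp != majority]

      responses = {
          "replica_a": "HTTP/1.1 200 OK, 1532 bytes",
          "replica_b": "HTTP/1.1 200 OK, 1532 bytes",
          "replica_c": "HTTP/1.1 200 OK, 914 bytes",   # deviates: possible compromise
      }
      print(detect_anomaly(responses))   # ['replica_c']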

  • The Importance of Cross-layer Considerations in a Standardized WSN Protocol Stack Aiming for IoT: The Internet of Things (Ubiquity symposium)

    The Internet of Things (IoT) envisages expanding the current Internet with a huge number of intelligent communicating devices. Wireless sensor networks (WSNs) integrated into the IoT will rely on a set of open standards striving to offer scalability and reliability in a variety of operating scenarios and conditions. Standardized protocols will tackle some of the major WSN challenges like energy efficiency, intrinsic impairments of the low-power wireless medium, and self-organization. After more than a decade of tremendous standardization efforts, we can finally witness an integral IP-based WSN standardized protocol stack for the IoT. Nevertheless, the current state of the standards has redundancy issues and can benefit from further improvements. We would like to highlight some of the cross-layer aspects that need to be considered to bring further improvements to the standardized WSN protocol stack for the IoT.

  • Evolution and Disruption in Network Processing for the Internet of Things: The Internet of Things (Ubiquity symposium)

    Between prophecies of revolutions and inertiae of legacies, the Internet of Things (IoT) has already become the brand under which light processing units communicate over complex networks. Network processing is caught between demands for computation, raised by the growing complexity of the networks, and limitations imposed by performance of lightweight devices on processing. In this contribution the potential for disruptive changes against the scaling of existing technologies is discussed, specifically three main aspects of the IoT that impact network protocols and their processing: the reversal of the client/server architectures, the scavenging of spectral bands, and the federation of Internet gateways.

  • Fog Computing Distributing Data and Intelligence for Resiliency and Scale Necessary for IoT: The Internet of Things (Ubiquity symposium)

    The Internet of Everything (IoE) is more than a $19 trillion opportunity over 10 years. Fifty billion devices will be connected to various networks by 2020. This brings new technical challenges in all domains, and specifically in data processing. Distributed intelligence is one of the key technological answers. We call it "fog computing." Fog can provide intelligent connection of people, processes, data, and things in hierarchical Internet of Things networks. By supplementing the cloud and providing intermediate layers of computation, networking, and storage, fog nodes can optimize IoE deployments---greatly enhancing latency, bandwidth, reliability, security, and overall IoE network performance. The article analyzes the architecture and main design choices of this technology.

  • A Case for Interoperable IoT Sensor Data and Meta-data Formats: The Internet of Things (Ubiquity symposium)

    While much attention has been focused on building sensing systems and backing cloud infrastructure in the Internet of things/Web of things (IoT/WoT) community, enabling third-party applications and services that can operate across domains and across devices has not been given much consideration. The challenge for the community is to devise standards and practices that enable integration of data from sensors across devices, users, and domains to enable new types of applications and services that facilitate much more comprehensive understanding and quantitative insights into the world around us.
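
    A sketch of the kind of self-describing record such standards might mandate, so that a third-party service can integrate readings without device-specific knowledge. The field names and units scheme are hypothetical, not any actual W3C or IoT/WoT format:

      # Hypothetical self-describing sensor record: the payload carries enough
      # metadata (units, type, provenance) for a third-party service to use it
      # without device-specific knowledge. Field names are illustrative only.
      import json

      reading = {
          "sensor_id": "urn:example:thermo-42",
          "quantity": "air_temperature",
          "value": 21.7,
          "unit": "celsius",
          "timestamp": "2016-05-01T12:00:00Z",
          "location": {"lat": 48.2, "lon": 16.4},
      }

      def to_kelvin(record):
          """A consumer can convert because the unit travels with the value."""
          assert record["unit"] == "celsius"
          return record["value"] + 273.15

      print(json.dumps(reading), to_kelvin(reading))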

  • Standards for Tomorrow: The Internet of Things (Ubiquity symposium)

    Over the decades, standards have been critical for defining how to interconnect computer and networking devices across different vendors so they can seamlessly work together. Standards have been critical, not only in networking and computer interfaces, but also at the operating system and systems software level. There are many examples, such as IEEE 802, POSIX, IETF, and W3C. There was always the question of the right time to standardize (not too early and not too late), and the time to complete a standardization project always seemed too long, but inevitable. However, the contemporary industry seems to be more dynamic and evolving than it has ever been, demanding more agile processes. Open source processes and software defined (networks, storage, data centers, etc.) offer alternatives to standards. In this article we attempt to envision the future role of standards, and how they will complement and enhance alternative choices toward the same goal. We first summarize traditional standards, then discuss alternatives and a couple of use cases, and conclude with some future directions and opportunities for standardization.

  • W3C Plans for Developing Standards for Open Markets of Services for the IoT: The Internet of Things (Ubiquity symposium)

    The Internet of Things (IoT) is being held back by divergent approaches that result in data silos, high costs, investment risks and reduced market opportunities. To realize the potential and unleash the network effect, W3C is focusing on the role of Web technologies for a platform of platforms as a basis for services spanning IoT platforms from microcontrollers to cloud-based server farms. Shared semantics are essential for discovery, interoperability, scaling and layering on top of existing protocols and platforms. For this purpose, metadata can be classified into: things, security, and communications, where things are considered to be virtual representations (software objects) for physical or abstract entities. Thing descriptions are modeled in terms of W3C's resource description framework (RDF). This includes the semantics for what kind of thing it is, and the data models for its events, properties and actions. The underlying protocols are free to use whichever communication patterns are appropriate to the context according to the constraints described by the given metadata. W3C is exploring the use of lightweight representations of metadata that are easy to author and process, even on resource constrained devices. The aim is to evolve the web from a web of pages to a "Web of Things."

  • Discovery in the Internet of Things: The Internet of Things (Ubiquity symposium)
    How to find a "thing" in the Internet of Things (IoT) haystack? The answer to this question will be the key challenge that IoT users and developers are facing now and will face in the future. Current models for IoT are focused heavily on developing vertical solutions limited by hardware and software platforms and support. With the estimated explosion of IoT in the coming years as predicted by Cisco, IBM and Gartner, there is a need to rethink how IoT can deliver value to the end-user. A paradigm shift is required in the underlying fundamentals of current IoT developments to enable a wider notion of "thing" discovery as well as discovery of relevant data and context on the IoT. Discovery will allow users to build IoT apps, services and applications using "smart things" without the need for a priori knowledge of things. In this article, we look at the current state of IoT and argue for paradigm shift addressing why and how discovery can make a significant impact for the future of IoT and moreover, become a necessary component for IoT success story.
  • On Quantum Computing: An interview with David Penkler

    In recent months, announcements on the progress toward harnessing quantum computing have solicited diverse and sometimes strong reactions and opinions from academia and industry. Some say quantum computing is impossible, while others point to actual machines, raising the question as to whether they really are quantum computers. In this interview, Dave Penkler---an HP fellow whose primary interests are in cloud and data-center scale operating systems and networks---shares his view on the present and future of quantum computing. Penkler has 40 years of experience with computer hardware and software and has always had a keen interest in their evolution as enabled by advances in science and technology.

  • What About an Unintelligent Singularity?: The technological singularity (Ubiquity symposium)

    For years we humans have worried about plagues, asteroids, earthquakes, eruptions, fires, floods, famines, wars, genocides, and other uncontrollable events that could wipe away our civilization. In the modern age, with so much depending on computing and communications, we have added computers to our list of potential threats. Could we perish from the increasing intelligence of computers? Denning thinks that is less of a threat than the apparently mundane march of automated bureaucracies. He also asserts that none of the possible negative outcomes is a foregone conclusion, because humans teaming with machines are far more intelligent than either one alone.

  • Computers versus Humanity: Do we compete?: The technological singularity (Ubiquity symposium)

    Liah Greenfeld and Mark Simes have long worked together, integrating the perspectives of two very different disciplinary traditions: cultural history/historical sociology and human neuroscience. The combination of their areas of expertise in the empirical investigation of mental disorders, which severely affect intelligence---among other things---has led them to certain conclusions that may throw a special light on the question of this symposium: Will computers outcompete us all?

  • Exponential Technology and The Singularity: The technological singularity (Ubiquity symposium)

    The Priesthood of the Singularity posits a fast-approaching prospect of machines overtaking human abilities (Ray Kurzweil's The Singularity is Near, Viking Press, 2006) on the basis of the exponential rate of electronic integration---memory and processing power. In fact, they directly correlate the growth of computing technology with that of machine intelligence, as if the two were connected in some simple-to-understand and predictable way. Here we present a different view based upon the fundamentals of intelligence and a more likely relationship. We conclude that machine intelligence is growing in a logarithmic (or at best linear) fashion, rather than at the assumed exponential rate.

  • Human Enhancement--The way ahead: The technological singularity (Ubiquity symposium)

    This paper looks at artificial intelligence and the ways it can be brought about, either by means of a computer or through biological growth. Ways of linking the two methods are also discussed, particularly the possibilities of linking human and artificial brains together. In this regard, practical experiments are referred to in which human enhancement can be achieved through linking with artificial intelligence.

  • The Singularity and the State of the Art in Artificial Intelligence: The technological singularity (Ubiquity symposium)

    The state of the art in automating basic cognitive tasks, including vision and natural language understanding, is far below human abilities. Real-world reasoning, which is an unavoidable part of many advanced forms of computer vision and natural language understanding, is particularly difficult---suggesting the advent of computers with superhuman general intelligence is not imminent. The possibility of attaining a singularity by computers that lack these abilities is discussed briefly.

  • The Future of Synchronization on Multicores: The multicore transformation (Ubiquity symposium)

    Synchronization bugs such as data races and deadlocks make every programmer cringe; traditional locks provide only a partial solution, while high-contention locks can easily degrade performance. Maurice Herlihy proposes replacing locks with transactions. He discusses adapting the well-established concept of database transactions to multicore systems and shared main memory.

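    A minimal sketch of the transactional idea in Python: reads proceed optimistically, and a commit validates and applies atomically, retrying on conflict. This illustrates the concept only; it is not Herlihy's design, and a real system would use hardware support or a finer-grained software transactional memory:

      # Minimal software-transactional-memory sketch: optimistic reads, then a
      # validate-and-commit step under one lock; conflicting transactions retry.
      import threading

      class TVar:
          """A transactional variable: a value plus a version counter."""
          def __init__(self, value):
              self.value, self.version = value, 0

      _commit_lock = threading.Lock()   # serializes commits, not whole transactions

      class Tx:
          def __init__(self):
              self.reads, self.writes = {}, {}
          def read(self, tv):
              if tv in self.writes:
                  return self.writes[tv]
              if tv not in self.reads:
                  self.reads[tv] = tv.version   # record version before reading value
              return tv.value
          def write(self, tv, value):
              self.writes[tv] = value

      def atomically(body):
          """Run body(tx) optimistically; retry if a concurrent commit
          invalidated anything we read."""
          while True:
              tx = Tx()
              result = body(tx)
              with _commit_lock:
                  if all(tv.version == v for tv, v in tx.reads.items()):
                      for tv, value in tx.writes.items():
                          tv.value, tv.version = value, tv.version + 1
                      return result
              # a TVar changed since we read it: discard writes and retry

      a, b = TVar(100), TVar(0)
      atomically(lambda tx: (tx.write(a, tx.read(a) - 10),
                             tx.write(b, tx.read(b) + 10)))
      print(a.value, b.value)   # 90 10, moved atomically with no explicit locks
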
  • The MOOC and the Genre Moment: MOOCs and technology to advance learning and learning research (Ubiquity symposium)

    In order to determine (and shape) the long-term impact of MOOCs, we must consider not only cognitive and technological factors but also cultural ones, such as the goals of education and the cultural processes that mediate the diffusion of a new teaching modality. This paper examines the implicit cultural assumptions in the "MOOCs and Technology to Advance Learning and Learning Research Symposium" and proposes an alternative theory of diffusion to Clayton Christensen's disruptive innovation model as an illustration of the complexity that these assumptions hide.

  • The Multicore Transformation Closing Statement: The multicore transformation (Ubiquity symposium)

    Multicore CPUs and GPUs have brought parallel computation within reach of any programmer. How can we put the performance potential of these machines to good use? The contributors of the symposium suggest a number of approaches, among them algorithm engineering, parallel programming languages, compilers that target both SIMD and MIMD architectures, automatic detection and repair of data races, transactional memory, automated performance tuning, and automatic parallelizers. The transition from sequential to parallel computing is now perhaps at the half-way point. Parallel programming will eventually become routine, because advances in hardware, software, and programming tools are simplifying the problems of designing and implementing parallel computations.

  • Making Effective Use of Multicore Systems A software perspective: The multicore transformation (Ubiquity symposium)

    Multicore processors dominate the commercial marketplace, with the consequence that almost all computers are now parallel computers. To take maximum advantage of multicore chips, applications and systems should take advantage of that parallelism. As of today, a small fraction of applications do. To improve that situation and to capitalize fully on the power of multicore systems, we need to adopt programming models, parallel algorithms, and programming languages that are appropriate for the multicore world, and to integrate these ideas and tools into the courses that educate the next generation of computer scientists.

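    As a minimal example of the programming-model shift the authors call for, a map-style computation written once can use however many cores the machine has; here in Python with its standard process pool (the workload is an arbitrary illustration):

      # Minimal data-parallel sketch: the same map-style computation spreads
      # across all available cores. The workload is an arbitrary example.
      from concurrent.futures import ProcessPoolExecutor

      def count_primes(limit):
          """Deliberately CPU-bound toy task."""
          def is_prime(n):
              return n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))
          return sum(1 for n in range(limit) if is_prime(n))

      if __name__ == "__main__":
          chunks = [200_000] * 8
          with ProcessPoolExecutor() as pool:        # one worker per core by default
              totals = list(pool.map(count_primes, chunks))
          print(sum(totals))
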
  • MOOCs: Symptom, Not Cause of Disruption: MOOCs and technology to advance learning and learning research (Ubiquity symposium)

    Is the MOOC phenomenon a disruptive innovation or a transient bubble? It may be partly both. Broadcasting lectures and opening up courses via MOOCs by itself changes little about the academic status quo. But academia is part of a broader academic-bureaucratic complex that provided a core framework for industrial-age institutions. The academic-bureaucratic complex rests on the premise that knowledge and talent must be scarce. Presumed scarcity justifies filtering access to information, to diplomas, and to jobs. But a wave of post-industrial technical, economic, and social innovations is making knowledge and talent rapidly more abundant and access more "open." This mega-trend is driving the academic-bureaucratic complex toward bankruptcy. It is being replaced by new, radically different arrangements of learning and work. The embrace of MOOCs is a symptom, not a cause, of academia's obsolescence.

  • GPUs: High-performance Accelerators for Parallel Applications: The multicore transformation (Ubiquity symposium)

    Early graphical processing units (GPUs) were designed as high compute density, fixed-function processors ideally crafted to the needs of computer graphics workloads. Today, GPUs are becoming truly first-class computing elements on par with CPUs. Programming GPUs as self-sufficient general-purpose processors is not only hypothetically desirable, but feasible and efficient in practice, opening new opportunities for integration of GPUs in complex software systems.

  • Offering Verified Credentials in Massive Open Online Courses: MOOCs and technology to advance learning and learning research (Ubiquity symposium)

    Massive open online courses (MOOCs) enable the delivery of high-quality educational experiences to large groups of students. Coursera, one of the largest MOOC providers, developed a program to provide students with verified credentials as a record of their MOOC performance. Such credentials help students convey achievements in MOOCs to future employers and academic programs. This article outlines the process and biometrics Coursera uses to establish and verify student identity during a course. We additionally present data that suggest verified certificate programs help increase student success rates in courses.

  • Data-driven Learner Modeling to Understand and Improve Online Learning: MOOCs and technology to advance learning and learning research (Ubiquity symposium)

    Advanced educational technologies are developing rapidly and online MOOC courses are becoming more prevalent, creating enthusiasm for the seemingly limitless data-driven possibilities to effect advances in learning and enhance the learning experience. For these possibilities to unfold, the expertise and collaboration of many specialists will be necessary to improve data collection, to foster the development of better predictive models, and to assure models are interpretable and actionable. The big data collected from MOOCs needs to be bigger, not in its height (number of students) but in its width: more metadata and information on learners' cognitive and self-regulatory states need to be collected in addition to correctness and completion rates. This more detailed articulation will help open up the black-box approach to machine learning models, where prediction is the primary goal. Instead, a data-driven learner-model approach uses fine-grained data that is conceived and developed from cognitive principles to build explanatory models with practical implications for improving student learning.

  • The Multicore Transformation Opening Statement: The multicore transformation (Ubiquity symposium)

    Chips with multiple processors, called multicore chips, have caused a resurgence of interest in parallel computing. Multicores are now available in servers, PCs, laptops, embedded systems, and mobile devices. Because multiprocessors could be mass-produced for the same cost as uniprocessors, parallel programming is no longer reserved for a small elite of programmers such as operating system developers, database system designers, and supercomputer users. Thanks to multicore chips, everyone's computer is a parallel machine. Parallel computing has become ubiquitous. In this symposium, seven authors examine what it means for computing to enter the parallel age.

  • Assessment in Digital At-scale Learning Environments: MOOCs and technology to advance learning and learning research (Ubiquity symposium)

    Assessment in traditional courses has been limited to either instructor grading, or problems that lend themselves well to relatively simple automation, such as multiple-choice bubble exams. Progress in educational technology, combined with economies of scale, allows us to radically increase both the depth and the accuracy of our measurements of what students learn. Increasingly, we can give rapid, individualized feedback for a wide range of problems, including engineering design problems and free-form text answers, as well as provide rich analytics that can be used to improve both teaching and learning. Data science and integration of data from disparate sources allows for increasingly inexpensive and accurate micro-assessments, such as those of open-ended textual responses, as well as estimation of higher-level skills that lead to long-term student success.

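    One micro-assessment primitive of the kind described, as a hedged sketch: score a free-text answer by bag-of-words cosine similarity against a reference answer. Real at-scale graders use far richer features and trained models:

      # Toy micro-assessment: score a free-text answer by cosine similarity of
      # word counts against a reference answer. Illustrates the primitive only.
      from collections import Counter
      from math import sqrt

      def cosine_score(answer, reference):
          a, b = Counter(answer.lower().split()), Counter(reference.lower().split())
          dot = sum(a[w] * b[w] for w in a)
          norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
          return dot / norm if norm else 0.0

      reference = "entropy measures the average information content of a source"
      print(round(cosine_score("entropy is the average information in a source",
                               reference), 2))
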
  • Ubiquity symposium: The science in computer science: natural computation

    In this twelfth piece of the Ubiquity symposium discussing science in computer science, Erol Gelenbe reviews computation in natural systems, focusing mainly on biology and citing examples of the computation that is inherent in chemistry, natural selection, gene regulatory networks, and neuronal systems. This article originally appeared as part of the "What is Computation" symposium.

  • Interview with Mark Guzdial, Georgia Institute of Technology: computing as creation

    Mark Guzdial is a Professor in the School of Interactive Computing at Georgia Institute of Technology (Georgia Tech). His research focuses on the intersection of computing and education, from the role of computing in facilitating education to how we educate about computing. In this interview with him, he discusses how we teach computing and to whom, especially his contention that a contextualized approach is a powerful tool to teach everyone about computing.

  • An interview with David Alderson: in search of the real network science

    There has been an explosion of interest in mathematical models of large networks, leading to numerous research papers and books. The National Research Council carried out a study evaluating the emergence of a new area called "network science," which could provide the mathematics and experimental methods for characterizing, predicting, and designing networks. David Alderson has become a leading advocate for formulating the foundations of network science so that its predictions can be applied to real networks.

  • Ubiquity symposium: Evolutionary computation and the processes of life: some computational aspects of essential properties of evolution and life

    While evolution has inspired algorithmic methods of heuristic optimization, little has been done in the way of using concepts of computation to advance our understanding of salient aspects of biological phenomena. The authors argue that, under reasonable assumptions, interesting conclusions can be drawn that are of relevance to behavioral evolution. The authors focus on two important features of life---robustness and fitness---which, they argue, are related to algorithmic probability and to the thermodynamics of computation, disciplines that may be capable of modeling key features of living organisms, and which can be used in formulating new algorithms of evolutionary computation.

  • Ubiquity symposium: The science in computer science: how to talk about science: five essential insights

    The goal of the LabRats Science Education Program is to inspire secondary school-age students from all backgrounds to love learning about science and technology. Shawn Carlson, the Executive Director of LabRats, presents five key insights that can be integrated into any science and technology program, with the purpose of overhauling students' attitudes and motivation to learn. Carlson also offers detailed suggestions on how educators can use these insights to inspire their students to become lifelong learners of science and technology.

  • Science and the spectrum of belief: an interview with Leonard Ornstein

    In 1965 Leonard Ornstein wrote a long and thoughtful essay on information and meaning. Shannon's idea that communication systems could transmit and process information without regard to its meaning just did not seem right to him. He was particularly interested in how scientists use and interpret information as part of science. Forty-eight years later, he is sharing how he sees science, discovery, information, and meaning with Ubiquity Magazine.

  • Ubiquity symposium: The science in computer science: the sixteen character traits of science

    Phil Yaffe has provided numerous commentaries on various aspects of professional communication, which have helped readers more effectively articulate their own ideas about the future of computing. Here he tells us about how scientists see the world---the "scientific approach," he calls it---because he thinks many non-scientists see the world in a similar way. This realization can lower barriers of communication with scientists.

  • Ubiquity symposium: The science in computer science: broadening CS enrollments: an interview with Jan Cuny

    Until 2000, computer science enrollments were steadily increasing. Then suddenly students started turning to other fields; by 2008, enrollments had dropped by 50 percent. In response, Jan Cuny has been leading a program at the National Science Foundation to increase both the number and diversity of students in computing. In this interview with Ubiquity, she discusses the magnitude of the problem and the initiatives underway to turn it around.

  • Ubiquity symposium: The science in computer science: computer science revisited

    The first article in this symposium, which originally appeared in Communications of the ACM, is courtesy of ACM President Vinton Cerf. Earlier this year, he called on all ACM members to commit to building a stronger science base for computer science. Cerf cites numerous open questions, mostly in software development, that cry out for experimental studies.

  • Ubiquity symposium: The science in computer science: opening statement

    The recent interest in encouraging more middle and high school students to prepare for careers in science, technology, engineering, or mathematics (STEM) has rekindled the old debate about whether computer science is really science. It matters today because computing is such a central field, impacting so many other fields, and yet it is often excluded from high school curricula because it is not seen as a science. In this symposium, fifteen authors examine different aspects of the question, from what science is, to natural information processes, to new science-enabled approaches in STEM education.

  • Ubiquity symposium: Evolutionary computation and the processes of life: the emperor is naked: evolutionary algorithms for real-world applications

    During the past 35 years the evolutionary computation research community has been studying properties of evolutionary algorithms. Many claims have been made, varying from the promise of an automatic programming methodology to that of solving virtually any optimization problem (as some evolutionary algorithms are problem independent). However, the most important claim concerned the applicability of evolutionary algorithms to very complex business problems, i.e., problems where other techniques have failed. So it is worthwhile to revisit this claim and to search for evolutionary algorithm-based software applications that have been accepted by businesses and industries. In this article Zbigniew Michalewicz attempts to identify reasons for the mismatch between the efforts of hundreds of researchers who make substantial contributions to the field of evolutionary computation and the small number of real-world applications based on concepts of evolutionary algorithms.

  • Ubiquity symposium: Evolutionary computation and the processes of life: the essence of evolutionary computation

    In this third article in the ACM Ubiquity symposium on evolutionary computation, Xin Yao provides a deeper understanding of evolutionary algorithms in the context of classical computational paradigms. This article discusses some of the most important issues in evolutionary computation. Three major areas are identified. The first is the theoretical foundation of evolutionary computation, especially computational time complexity analysis. The second is algorithm design, especially hybridization, memetic algorithms, algorithm portfolios, and ensembles of algorithms. The third is co-evolution, which seems to be understudied in both theory and practice. The primary aim of this article is to stimulate further discussion, rather than to offer any solutions.

  • Ubiquity symposium: Evolutionary computation and the processes of life: opening statement

    Evolution is one of the indispensable processes of life. After biologists found the basic laws of evolution, computer scientists began simulating evolutionary processes and using operations discovered in nature for solving problems with computers. As a result, they brought forth evolutionary computation, inventing different kinds of operations and procedures, such as genetic algorithms or genetic programming, which imitate natural biological processes. Thus, the main goal of our symposium is to explore the essence and characteristic properties of evolutionary computation in the context of life and computation.

  • Writing secure programs: an interview with Steve Lipner

    Protecting computing systems and networks from attackers and data theft is an enormously complicated problem. The individual operating systems are complex (typically more than 40 million lines of code), they are connected to an enormous Internet (on the order of 1 billion hosts), and the whole network is heavily populated (more than 2.3 billion users). Hunting down and patching vulnerabilities is a losing game.

  • Bringing architecture back to computing: an interview with Daniel A. Menascé

    Over the past 10 or 20 years, the subject of machine organization and system architecture has been deemphasized in favor of the powerful abstractions that support computational thinking. We have grown accustomed to slogans like "computing is bits, not atoms"---suggesting that bits are not physical and the properties of the physical world are less and less important for understanding computation.

  • Dark innovation: An interview with Jerry Michalski

    As computing technologists, we tend to think of innovations in terms of new products or services supported by, or made of, computing technologies. But there are other types of innovation besides products. There are process innovations, such as McDonald's method of making hamburgers fast; social innovations, such as Mothers Against Drunk Driving; and business model innovations, such as Starbucks replacing a coffee shop with an Internet cafe. In all these categories, we tend to think of innovations as new ways of doing things that positively impact many people.

  • A 10 Point Checklist for Getting it Off the Shelf: An interview with Dick Urban

    Far too many R&D programs in industry as well as government result in reports or prototypes that represent fundamentally good ideas but end up gathering dust on a shelf. Ellison "Dick" Urban, formerly of DARPA (Defense Advanced Research Projects Agency) and now the Director of Washington Operations at Draper Laboratory, has had considerable experience with technology transition. We talked to him about his guidelines for success.

  • The Law, the Computer, and the Mind: An interview with Roy Freed

    2011 marked the 50th anniversary of the first educational program on computer law, sponsored by the Joint Committee on Continuing Professional Education of the American Law Institute and the American Bar Association (ALI-ABA). In 1971 at an ACM conference, Roy Freed and six colleagues founded the Computer Law Association (CLA), an international bar association (renamed later as the International Technology Law Association).

  • On experimental algorithmics: an interview with Catherine McGeoch and Bernard Moret

    Computer science is often divided into two camps, systems and theory, but of course the reality is more complicated and more interesting than that. One example is the area of "experimental algorithmics," also termed "empirical algorithmics." This fascinating discipline marries algorithm analysis, which is often done with mathematical proofs, with experimentation with real programs running on real machines.

  • Honesty is the best policy---Part 2: an interview with Rick Hayes-Roth

    Untrustworthy information is an increasing threat to decision making in information environments. Rick Hayes-Roth has been studying how to detect and filter away untrustworthy information and base decisions on well-grounded claims that can improve outcomes. In last week's installment of this two-part interview, we focused on the problem and the principles that help ameliorate it. In this installment, we focus on the means to implement the principles in our information environments.

  • Honesty is the best policy---part 1: an interview with Rick Hayes-Roth

    Untrustworthy information is an increasing threat to decision making in information environments. Rick Hayes-Roth has been studying how to detect and filter away untrustworthy information and base decisions on well-grounded claims that can improve outcomes. We interviewed him to find out more about this problem and get advice for our readers. Although there are many subtleties in the shades of truth and the intentions of speakers and listeners, Hayes-Roth finds the essential core of what you can do to ward off untrustworthy information.

  • An interview with Richard John: the politics of network evolution

    Richard John is a professor at the Graduate School of Journalism, Columbia University, and a historian of communications networks in the United States. His most recent book, Network Nation, won the inaugural Ralph Gomory prize from the Business History Conference and the AEJMC prize for the best book in the history of journalism and mass communications.

  • Empirical software research: an interview with Dag Sjøberg, University of Oslo, Norway

    Punched cards were already obsolete when I began my studies at the Technical University of Munich in 1971. Instead, we had the luxury of an interactive, line-oriented editor for typing our programs. Doug Engelbart had already invented the mouse, but the device was not yet available. With line editors, users had to identify lines by numbers and type in awkward substitution commands just to add missing semicolons. Though cumbersome by today's standards, it was obvious that line-oriented editors were far better than punched cards. Not long after, screen oriented editors such as Vi and Emacs appeared. Again, these editors were obvious improvements and everybody quickly made the switch. No detailed usability studies were needed. "Try it and you'll like it" was enough. (Brian Reid at CMU likened screen editors to handing out free cocaine in the schoolyard.) Switching from Assembler to Fortran, Algol, or Pascal also was a no-brainer. But in the late '70s, the acceptance of new technologies for building software seemed to slow down, even though more people were building software tools. Debates raged over whether Pascal was superior to C, without a clear winner. Object-oriented programming, invented back in the '60s with Simula, took decades to be widely adopted. Functional programming is languishing to this day. The debate about whether agile methods are better than plan-driven methods has not led to a consensus. Literally hundreds of software development technologies and programming languages have been invented, written about, and demoed over the years, only to be forgotten. What went wrong?

  • An interview with Bob Metcalfe: Bob Metcalfe is going meta on innovation

    Bob Metcalfe thinks we are in a bubble, an innovation bubble, seeing that the word "innovation" is on everybody's lips. To help ensure that this bubble does not burst, he has embarked on a new career path as Professor of Innovation and Murchison Fellow of Free Enterprise at the University of Texas at Austin. This is his fifth career, building on his work as an engineer-scientist leading the invention of Ethernet in the 1970s, entrepreneur-executive and founder of 3Com in the 1980s, publisher-pundit and CEO of InfoWorld in the 1990s, and venture capitalist in the 2000s. As General Partner with Polaris Venture Partners, he has invested primarily in cleantech and currently serves on the boards of five companies: Ember, Sun Catalyx, 1366 Technologies, Infinite Power, and SiOnyx.

  • An Interview with Peter Denning: the end of the future

    Ubiquity is dedicated to the future of computing and the people who are creating it. What exactly does this mean for readers, for contributors, and for editors soliciting and reviewing contributions? We decided to ask the editor in chief, Peter Denning, how he approaches the future, and how his philosophy is reflected in the design and execution of the Ubiquity mission. He had a surprisingly rich set of answers to our questions. We believe his answers may be helpful for all our readers with their own approaches to their own futures.

  • An interview with Melanie Mitchell: On complexity

    Melanie Mitchell, a Professor of Computer Science at Portland State University and an External Professor at the Santa Fe Institute, has written a compelling and engaging book entitled Complexity: A Guided Tour, published just last year by Oxford University Press. This book was named by Amazon.com as one of the 10 best science books of 2009. Her research interests include artificial intelligence, machine learning, biologically inspired computing, cognitive science, and complex systems.

  • Resurrecting the bullet point: the return of an old and valued friend

    PowerPoint has come under attack in recent years. Well known figures such as Edward Tufte have castigated PowerPoint for corrupting minds and numbing thought. Some sociologists have condemned it for luring people away from listening to each other and communicating effectively. Scott Adams (author of Dilbert) often depicts PowerPoint as a facilitator of office dysfunction. From all this, you might think PowerPoint has badly wounded us and our society with its barrage of bullet points.

  • Ubiquity symposium: What have we said about computation?: closing statement

    The "computation" symposium presents the reflections of thinkers from many sectors of computing on the fundamental question in the background of everything we do as computing professionals. While many of us have too many immediate tasks to allow us time for our own deep reflection, we do appreciate when others have done this for us. Peter Freeman points out, by analogy, that as citizens of democracies we do not spend a lot of time reflecting on the question, "What is a democracy," but from time to time we find it helpful to see what philosophers and political scientists are saying about the context in which we act as citizens.

  • Ubiquity symposium: What is information?: beyond the jungle of information theories

    Editor's Introduction: This fourteenth piece is inspired by a question left over from the Ubiquity symposium entitled What is Computation? --Peter J. Denning, Editor

    Computing first saw the light as a branch of mathematics in the '40s and has progressively revealed ever new aspects [gol97]. Nowadays even laymen are aware of the broad assortment of functions achieved by computing systems, and the prismatic nature of computing challenges thinkers who explore the various topics that substantiate computer science [mul98].

  • Ubiquity symposium: Biological Computation

    In this thirteenth piece in the Ubiquity symposium discussing What is computation?, Melanie Mitchell discusses the idea that biological computation is a process that occurs in nature, not merely in computer simulations of nature.
    --Editor

  • An Interview with Joseph F. Traub

    Joseph F. Traub is the Edwin Howard Armstrong Professor of Computer Science at Columbia University and External Professor, Santa Fe Institute. In this wide-ranging interview, he discusses his early research, organizations and other entities he has created, and offers his view on several open-ended topics on the future of computing.
    --Editor

  • Ubiquity symposium: Natural Computation

    In this twelfth piece in the Ubiquity symposium discussing What is computation?, Erol Gelenbe reviews computation in natural systems, focusing mainly on biology and citing examples of the computation that is inherent in chemistry, natural selection, gene regulatory networks, and neuronal systems.
    --Editor

  • Ubiquity symposium: Computation, Uncertainty and Risk

    In this eleventh piece in the Ubiquity symposium discussing What is computation?, Jeffrey P. Buzen develops a new computational model for representing computations that arise when deterministic algorithms process workloads whose detailed structure is uncertain.
    --Editor

  • An Interview with Mark Guzdial

    Mark Guzdial is a Professor in the School of Interactive Computing at Georgia Institute of Technology (Georgia Tech). His research focuses on the intersection of computing and education, from the role of computing in facilitating education to how we educate about computing. In this interview, he discusses how we teach computing and to whom, especially his contention that a contextualized approach is a powerful tool to teach everyone about computing.
    --Editor

  • An Interview with Erol Gelenbe

    This is Part I of an interview with Professor Erol Gelenbe, conducted by Professor Cristian Calude. Gelenbe holds the Dennis Gabor Chair Professorship in the Electrical and Electronic Engineering Department at Imperial College London and is an associate editor for this publication. This interview also appeared in the October 2010 issue of the Bulletin of the European Association for Computer Science and is printed here with permission.
    --Editor

  • Ubiquity symposium 'What is computation?': Computation is process

    Various authors define forms of computation as specialized types of processes. As the scope of computation widens, the range of such specialties increases. Dennis J. Frailey posits that the essence of computation can be found in any form of process, hence the title and thesis of this paper in the Ubiquity symposium discussing what is computation. --Editor

  • Ubiquity symposium 'What is computation?': Computation is symbol manipulation

    In the second of the series of articles in the Ubiquity symposium What is Computation?, Prof. John S. Conery of the University of Oregon explains why he believes computation can be seen as symbol manipulation. For more articles in this series, see the table of contents in the Editor's Introduction to the symposium: http://ubiquity.acm.org/article.cfm?id=1870596 --Editor

  • An Interview with Prof. Andreas Zeller: Mining your way to software reliability
    In 1976, Les Belady and Manny Lehman published the first empirical growth study of a large software system, IBM's OS/360. At the time, the operating system was twelve years old and the authors were able to study 21 successive releases of the software. By looking at variables such as the number of modules, the time for preparing releases, and the modules handled between releases (a defect indicator), they were able to formulate three laws: the law of continuing change, the law of increasing entropy, and the law of statistically smooth growth. These laws are valid to this day.

    Belady and Lehman were ahead of their time. They understood that empirical studies such as theirs might lead to a deeper understanding of software development processes, which might in turn lead to better control of software cost and quality. However, studying large software systems proved difficult, because complete records were rare and companies were reluctant to open their books to outsiders. Three important developments changed this situation for the better.

    The first was the widespread adoption of configuration management tools, starting in the mid-1980s. Tools such as RCS and CVS recorded complete development histories of software. These tools stored significantly more information, in greater detail, than Belady and Lehman had available. The history allowed the reconstruction of virtually any configuration ever compiled in the life of the system. I worked on the first analysis of such a history to assess the cost of several different choices for smart recompilation (ACM TOSEM, Jan. 1994). The second important development was the inclusion of bug reports in the histories, linked to the offending software modules. This information proved extremely valuable, as we shall see in this interview. The third important development was the emergence of open source, through which numerous large development histories became available for study.

    Soon, workers began to analyze these repositories. Workshops on mining software repositories have been taking place annually since 2004. I spoke with Prof. Andreas Zeller about the nuggets of wisdom unearthed by the analysis of software repositories. Andreas works at Saarland University in Saarbrücken, Germany. His research addresses the analysis of large, complex software systems, especially the analysis of why these systems fail to work as they should. He is a leading authority on analyzing software repositories and on testing and debugging. --Walter Tichy, Editor
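    As a present-day aside (our sketch, not Zeller's method; git postdates the tools named above), the flavor of such mining is easy to convey: approximate a defect indicator by counting how often each file is touched by commits whose messages mention a fix.

        # Illustrative sketch: a crude "defect indicator" mined from a
        # git history, counting files touched by fix-like commits.
        import subprocess
        from collections import Counter

        log = subprocess.run(
            ["git", "log", "--name-only", "--pretty=format:#%s"],
            capture_output=True, text=True, check=True,
        ).stdout

        counts, fixing = Counter(), False
        for line in log.splitlines():
            if line.startswith("#"):       # commit subject line
                fixing = "fix" in line.lower()
            elif line and fixing:          # file path listed under a fix commit
                counts[line] += 1

        for path, n in counts.most_common(10):
            print(f"{n:4d}  {path}")

    Real studies are far more careful about linking bug reports to changes, but the principle is the same: counting and correlating events in the recorded history.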
  • Ubiquity symposium 'What is computation?': Opening statement

    Most people understand a computation as a process evoked when a computational agent acts on its inputs under the control of an algorithm. The classical Turing machine model has long served as the fundamental reference model because an appropriate Turing machine can simulate every other computational model known. The Turing model is a good abstraction for most digital computers because the number of steps to execute a Turing machine algorithm is predictive of the running time of the computation on a digital computer. However, the Turing model is not as well matched for the natural, interactive, and continuous information processes frequently encountered today. Other models whose structures more closely match the information processes involved give better predictions of running time and space. Models based on transforming representations may be useful.
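    To make the reference model concrete, here is a minimal sketch (our illustration, not part of the symposium text) of the classical setup: an agent stepping over a tape under the control of a transition table. The toy table shown simply inverts a binary input and halts at the first blank.

        # Minimal Turing machine simulator (illustrative sketch).
        # A machine is a transition table: (state, symbol) -> (state, symbol, move).
        def run(tape, table, state="q0", halt="halt"):
            cells = dict(enumerate(tape))      # sparse tape; blank is "_"
            pos = 0
            while state != halt:
                sym = cells.get(pos, "_")
                state, write, move = table[(state, sym)]
                cells[pos] = write
                pos += 1 if move == "R" else -1
            return "".join(cells[i] for i in sorted(cells))

        # Toy machine: flip every bit, halt on the blank after the input.
        FLIP = {
            ("q0", "0"): ("q0", "1", "R"),
            ("q0", "1"): ("q0", "0", "R"),
            ("q0", "_"): ("halt", "_", "R"),
        }
        print(run("1011", FLIP))               # prints 0100_ (trailing blank)

    The number of steps such a machine takes is what the opening statement refers to as being predictive of running time on a digital computer.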

  • The New Ubiquity
    Ubiquity's new site will launch this month, marking a new editorial direction. Ubiquity is now a peer-reviewed online publication of ACM dedicated to the future of computing and the people who are creating it.
  • An Interview with Chris Gunderson: Are Militaries Lagging Their Non-State Enemies in Use of Internet?
    The increasing number of cyber attacks on military networks and servers has raised the question of what the global defense community is doing to safeguard military systems and protect the larger global Internet. Ubiquity's editor interviewed Chris Gunderson, who served in the U.S. Navy from 1973 to 2004 and became an expert in "network centric" warfare, on this question and in particular on how military philosophy must change to adapt to the rise of information networks.
  • An Interview with David Alderson: In Search of the Real Network Science
    David Alderson has become a leading advocate for formulating the foundations of network science so that its predictions can be applied to real networks. He is an assistant professor in the Operations Research Department at the Naval Postgraduate School in Monterey, Calif., where he conducts research with military officer-students on the operation, attack, and defense of network infrastructure systems. Ubiquity interviewed him to find out what is going on.
  • How to Generate Reader Interest in What You Write
    Who has not discovered to their dismay that no one wants to read their most carefully crafted, meritorious, compelling, and passionate writings? Think of all the proposals you have written that no one is interested in. Or the web pages, the blog posts, or the company brochures. Chances are, your failures are linked to an inability to connect with what your readers would be interested in reading. Our intrepid writer about writing, Phil Yaffe, offers some valuable insight into how to get people to read your stuff. He says you need to adopt the "expository writing challenge": that no one is interested in what you are inclined to write, therefore you must discover what they want to read. Only then can you get started, and only then can you succeed.
  • Is Design the Preeminent Protagonist in User Experience?
    We are gradually learning that "user experience" is a critical factor in customer satisfaction and loyalty. A positive experience means a happy customer who returns again. Designers of software systems and web services have been digging deeply into how they might generate a positive user experience. They are moving beyond anecdotes about excellent examples of user experiences and are developing design principles. Phillip Tobias gives us a fascinating account of the emerging design principles that will generate satisfied and loyal users.
  • Mind Hygiene for All: A Concept Map
    Maintaining mental sharpness and clarity is important to most everyone, and doing so is valuable for maintaining our professional edge. But we are under assault from many directions with challenges that can interfere with mental sharpness. Some of the challenges are familiar; others hide in the background. What are these challenges and what can we do about them? Goutam Saha has a very concise summary of everything contributing to mental hygiene, including the challenges and actions to meet them. He expresses this with mind maps, which themselves contribute to mental clarity.
  • How to Rapidly Improve Speaking Skills
    Even as written communication is important, spoken communication has been assuming an increasing role. We are called on to speak in such media as videos, teleconferences, and podcasts. Our ability to speak clearly is as important as our ability to formulate our arguments concisely and clearly. Phil Yaffe, who has provided advice to Ubiquity readers on how to write clearly and concisely, offers advice on how to speak clearly.
  • The Fallacy of Premature Optimization
    Moore's Law makes it seem as if resource limitations are always a minor consideration. If there will be twice as much memory for the same price in 18 months, why bother to squeeze a factor of 2 from an application's memory requirements? If the CPU will be twice as fast by then, why bother to shave some running time from a program? In other words, why bother to optimize programs? Isn't it better to just get them running and let Moore's Law take us off the hook when resources are constrained? Randall Hyde argues that optimization is important even when memory and processor double regularly. Trying to do the optimization too early can be a futile time-waster.
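    Hyde's point is about judgment, but its practical corollary is simple: measure before you optimize. Here is a minimal sketch (our example, not from the article) using Python's standard timeit module to compare two candidate implementations instead of guessing which one needs the work.

        # Measure first, optimize second: a small timing harness built on
        # the standard timeit module (illustrative example).
        import timeit

        def concat_naive(n):       # repeated concatenation (can be quadratic)
            s = ""
            for i in range(n):
                s += str(i)
            return s

        def concat_join(n):        # build the pieces, join once
            return "".join(str(i) for i in range(n))

        for fn in (concat_naive, concat_join):
            t = timeit.timeit(lambda: fn(10_000), number=50)
            print(f"{fn.__name__}: {t:.3f}s")

    If the measured difference is negligible for your workload, optimizing that spot now is exactly the premature effort Hyde warns against.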
  • How Crafty Word Order Can Instantly Improve Your Writing
    I am usually very reticent about offering writing tips. Unless they are linked to the absolute, inescapable fundamental principles of good writing, such tips are too often poorly applied or misapplied. There is really only a handful of fundamental writing principles. Before this extraordinary tip can be properly revealed, we need to review three of them: 1) clarity, 2) conciseness, and 3) density.
  • An Interview with Peter Huber: Why 99.9 Percent Is Not Good Enough
    In the opening days of 2009, people are looking for the new President Obama to restore domestic and international confidence and help us find our way out of a dark recession. Electric power generation and distribution is a key part of a new direction. Can we produce enough of it to reduce our oil usage? Can electric cars become reliable and cover enough distance on a single charge? Can its availability be increased, especially since critical services in transportation, banking, computing and many other sectors can be shut down by power grid failures? In April 2000, Ubiquity Editor John Gehl spoke with energy expert Peter Huber about these issues. Huber's comments about the needs of the power grid were prophetic. We gladly bring them to you now in the hope that they will help you understand the power challenges ahead. --Peter Denning, Ubiquity Editor
  • Long Live the .250 Hitter
    The dearth of women in computing is very much on everyone's mind. Elena Strange offers a new perspective on this. She observes that the solid, utility hitters (and players) are the backbone of every baseball team. In playing on her computing teams she has no aspirations for MVP awards and strives for personal excellence in the things she does. She asks her male colleagues to value her as a .250 hitter without holding her to the standard of a .314 hitter. This simple change could open the gates to a flood of women in computing. Elena holds Grace Hopper as the equivalent of the legendary .314 hitter in computing. Hopper told her friends that she was never aspiring to be a legendary leader, but only to do the best possible job with the tasks that were before her. Be personally excellent and interact with people from your heart, said Hopper, and all the rest will take care of itself. You can see in Elena's story the seeds that Grace Hopper planted.
  • An Interview with Randy Pausch: Immersed in the Future: On the Future of Education
    Before he became ill, Randy Pausch spoke with Ubiquity Editor John Gehl in 2005. The declining enrollments in computer science were already very much on his mind. At that time, they were down 23 percent. Pausch called this a "huge problem". He noted that, even for those committed to teaching programming from the outset, kids programming in Alice were far more engaged than those trying to find Fibonacci numbers. The enrollments have since declined another 25 percent and the problem is even "huger" than before. Randy's ideas about what turns kids on are even more important today. --Peter Denning, Editor
  • An Interview with Frans Johansson: The Medici Effect
    In this time of recession, innovation has jumped to the fore in many people's minds. How can we create new value through innovations and pull our individual companies out of the doldrums? In 2004, Frans Johansson published his book, The Medici Effect, in which he discussed how crossing community boundaries leads to innovations, and he said that the most effective way to create the crossing is to mix people from the communities in a common setting. John Gehl spoke with Johansson shortly after the book was published. Johansson's words are worth thinking about now as we reflect on what we all must do next.
  • The Power of Dispositions
    Many people have been trying to come to grips with the new ways of learning that are supported by networked tools in recent years. These new ways feature distributed social networks at their core and are proving to be much more popular and often more effective than traditional schooling. Science communities such as faulkes-telescope.com and labrats.org, and massive multiplayer games such as World of Warcraft, are in the vanguard. John Seely Brown and Doug Thomas make an important contribution to understanding what makes these networks so powerful. They use the term disposition to refer to an attitude or stance toward the world that inclines the person toward effective practice. They find that a "questing disposition", which has always been important for inquiry and learning, is encouraged and supported in these vanguard social learning networks. Their work will reward your time and attention. --Peter Denning, Editor
  • An Interview with Michael Schrage
    It is November 2008 and much of the globe is in the throes of recession. Innovation is on many minds. We need new products and new services generating new value for our customers and our companies. It is more important than ever to innovate. The problem is that our collective success rate is abysmal -- 4 percent according to Business Week in August 2005. As we set out on new innovation initiatives, it is a good time to reflect on the illusions that drag our success rates so low. One illusion is that innovation is a novel idea or product; another is that those who spend more on R&D get more innovation; and another is that innovation is about great inventions. Michael Schrage of MIT has been challenging these illusions for a long time. He discussed them with Ubiquity editor John Gehl in February 2006. Now is the perfect time to reflect again on what Michael has to say to us about innovation. --Peter Denning, Editor
  • Presidential Politics and Internet Issues in the 2000 Election
    As with the US election of 2000, the US election of 2008 features two slates and four new faces running for the top offices. While many of the issues concerning the electorate are different in 2008 than in 2000, remarkably some issues are the same. We thought you might be amused at Doug Isenberg's resurrected reflections on the 2000 election. You can see what has changed and what has not.
  • Mirrorware
    As we use and design computing systems, Michael Schrage asks us to reflect on what these systems reveal of ourselves and not just what they reveal to others. We may find many surprises about design and privacy. In 1892, the newspapers published a series of editorials of leading thinkers about what the world would be like in 1992. (See Dave Walter, TODAY THEN, Am Geographical Union, 1992.) Collectively, they were almost 100 percent wrong. Their reflections revealed more about how they saw themselves than about the future. This is exactly what Michael Schrage is warning us about.
  • An Interview with Terry Winograd: Convergence, Ambient Technology, and Success in Innovation
    Terry Winograd is Professor of Computer Science at Stanford University, where he directs the program on human-computer interaction. His SHRDLU program, done at the MIT AI Lab, was one of the early explorations in natural language understanding by computers. His book with Fernando Flores, Understanding Computers and Cognition, critiqued the underlying assumptions of AI and much of computer system design, and led to completely new directions in those fields. He was a founder and national president of Computer Professionals for Social Responsibility. His remarks, made in 2002, are as relevant today as they were when first spoken.
  • Why Does Time Go Faster As We Get Older?
    Persons in every age group wonder why time seems to move so much faster than it did in their pasts. It seems as if there is never enough time to get everything done and that the situation only gets worse. Many explanations have been offered for this, but few seem to hit the target as well as Phil Yaffe's explanation. We hope you enjoy and find it provocative. Phil has been a writer and journalist for over four decades and is able to write eloquently about his personal experience with accelerating time.
  • The Three Acid Tests of Persuasive Writing
    If there are still scientists toiling away with little regard for what others may think of their efforts, it's time to drag them kicking and screaming into the 21st century. In today's interconnected world, what scientists do is of vital concern to the wider public. And vice versa. Just consider the controversies surrounding nuclear energy, genetic engineering, telephone antennas, global warming -- and even the effects of computers on education and individual liberty (surveillance society).
  • My Problem with Design
    I was reminded today of the things I find troubling about our modern notions of design and designing. Hundreds of years ago, if one wanted to become a designer, one would first have become a master craftsperson. We learned how to construct distinctive artifacts (and worlds of artifacts) and then we began to innovate in that tradition. To say one was a designer without that background would have been Harry Potteresque: ridiculous.
  • Information, DNA, and Change Through the Prism of a Great City
    After retiring from a career in academic/IT management at Carnegie-Mellon, Northeastern, and Washington and Lee Universities, John Stuckey is serving as Acting Chief Technology Officer at the American University in Cairo. This is the third of his reports to Ubiquity from Egypt.
  • Emergence of the Academic Computing Clouds

    Computational grids are very large-scale aggregates of communication and computation resources enabling new types of applications and bringing several benefits of economy-of-scale. The first computational grids were established in academic environments during the previous decade, and today are making inroads into the realm of corporate and enterprise computing.

    Very recently, we have observed the emergence of cloud computing as a potential new superstructure for corporate, enterprise, and academic computing. While cloud computing shares the original vision of grid computing articulated in the 1990s by Foster, Kesselman, and others, there are significant differences.

    In this paper, we first briefly outline the architecture, technologies, and standards of computational grids. We then point to some notable examples of academic use of grids and sketch the future of research in grids. In the third section, we draw some architectural lines of cloud computing, hint at the design and technology choices, and indicate some future challenges. In conclusion, we claim that academic computing clouds might appear soon, supporting the emergence of Science 2.0 activities, some of which we briefly list.

  • Wot do U think? (What Do You Think?)
    (NOTE TO READERS: Out of sheer curiosity I used a website that allowed me to translate text from English to the language used by those who send and receive text messages. The second part of this article contains a copy of the entire text that was thus translated.)
  • Technological Transformation of Human Experience
    This article was inspired by Don Ihde's work on the experience of technology in human-machine relations. (See Don Ihde. "The Experience of Technology," Cultural Hermeneutics, Vol. 2, 1974, pp. 267-279.)
  • Professor Andy Clark on Natural-born Cyborgs
    Bio of Dr. Clark: Dr. Andy Clark is a professor of philosophy and chair in logic and metaphysics at the University of Edinburgh in Scotland. Previously, he taught at Washington University in St. Louis and the University of Sussex in England. Clark is one of the founding members of the Contact collaborative research project, whose aim is to investigate the role the environment plays in shaping the nature of conscious experience. Dr. Clark's research interests include philosophy of mind and artificial intelligence, including robotics, artificial life, embodied cognition, and mind, technology, and culture. Dr. Clark's papers and books deal with the philosophy of mind, and he is considered a leading scientist in mind extension. He has also written extensively on connectionism, robotics, and the role and nature of mental representation.
  • Thoughts on the Nature of the Virtual
    This article seeks to formulate some brief sociological and philosophical thoughts on the radically problematic nature and character of the virtual. These ultimately aim to critically challenge and reinvent the complex interrelations of contemporary virtuality to the real and the political. In such a context, new media studies acquire a normative impetus.
  • Can Learning Languages Help You Better Understand Science and Technology?
    "I was 24 years old when I first began thinking and speaking in a foreign language. It was like being released from prison. I saw my cell door swinging open and my mind flying free. That was over 40 years ago, but the picture is as fresh now as if it had just happened."
  • An Interview with Richard A. DeMillo
    Richard A. DeMillo is the Dean of Georgia Tech's College of Computing. He previously was Hewlett-Packard's chief technology officer and served as director of the Georgia Tech Information Security Center. Under DeMillo's leadership, Georgia Tech's College of Computing has replaced the core curriculum for undergraduates with an ambitious and innovative Threads program, as he explains in this interview with Ubiquity's editor-in-chief John Gehl.
  • Information technology as an ethical challenge
    Information technology has an ambiguous impact on society. This situation calls for a two-level ethical analysis. On the one hand the issues of power and control must be reconsidered under the viewpoint of institutional structures, i.e., of living norms. On the other hand, the technological shaping of society, taking the character of power, oppression, verbosity and dogmatic belief, should be at the same time reconsidered under the viewpoint of a plurality of living forms, i.e., within a framework of deliberation and dissent. This paper presents briefly both issues, taking into account Michel Foucault's concept of "technologies of the self."
  • Dimension of Philosophy of Technologies: Critical Theory and Democratization of Technologies
    Philosophy of technology promises the possibility of an understanding of technology that may be important not only to public policy but also in helping to conceptualise intellectual approaches to the study of technology and, indeed, to shaping new fields of knowledge and research. Philosophy of technology may also have a role to play in relation not only to structuring a largely disparate and inchoate field but also more directly in teaching and learning about technology (Peters et al. 2008).
  • An Interview with Wei Zhao
    Wei Zhao is currently the Dean of the School of Science at Rensselaer Polytechnic Institute. Before he joined RPI in 2007, he was a Senior Associate Vice President for Research at Texas A&M University. Between 2005 and 2007, he also served as the Director for the Division of Computer and Network Systems in the National Science Foundation. He completed his undergraduate program in physics at Shaanxi Normal University, Xi'an, China, in 1977. He received his M.Sc. and Ph.D. degrees in Computer and Information Sciences at the University of Massachusetts at Amherst in 1983 and 1986, respectively. During his career, he has also served as a faculty member at Amherst College, the University of Adelaide, and Texas A&M University. This interview was conducted by Ubiquity editor-in-chief John Gehl.
  • Mathematics by Jannat
    Arrgh!!! Well! If this is your reaction upon hearing the word MATH, you are not alone. You too are part of that ever increasing family which loves to hate it.
  • 21st Century Information Technology Revolution

    The computing power in the few microprocessors that are now in a Ford motor car is much more than all the computing power that was put in the space vehicle that landed the first men on the moon and brought them back. In today's do-more-with-less business environment, with increasing demands from customers, shareholders, and regulators, the IT organization is not only asked to work harder and smarter, but is being asked to take on the role of assuring the business.

    Humanity has progressed from the agricultural revolution to the industrial revolution and is now moving into an information revolution. It is this awesome computing power at continuously falling prices, and the networking of computers over global telecom highways, that is leading to the use of information technology in every sector of human activity, be it communication, banking, trading, learning and teaching, entertainment, socializing, government, management, or librarianship. Just as machines have extended man's mechanical power, convenience, and comfort, information technology, as commonly pictured by computers, is extending man's mind, brain, or intellectual power. The term information technology has ballooned to encompass many aspects of computing and technology, and the term is more recognizable than ever before.

  • Technology based outsourcing K-12 mathematics and science teaching

    The author suggests that the teaching of mathematics and science in K-12 schools be outsourced to teachers in other countries whose students achieve better in mathematics and science. He outlines the advantages of using telecommunications technologies to outsource the teaching of mathematics and science.

  • Scarce resources in computing
    How we organize computing - and innovate with it - is shaped by whatever is the scarcest resource at any given time. In the early days of computing, processing (and, to a certain extent, storage, which up to a point is a substitute for processing) was the main scarce resource. Computers were expensive and weak, so you had to organize what you did with them to make as much out of the processing capacity as possible. Hence, with the early computers, much time was spent making sure the processor was fully used, by meticulously allocating time for users on the machine - first with scheduled batch processing, then with time-sharing operating systems that rationed processing resources to users based on need and budget.
  • SC08 broader engagement offers mentoring and travel assistance grants
    Austin, TX: Interested in understanding what supercomputing means? Want to learn how next-generation computing, networking, and storage technologies help to solve our world's challenges and problems? Do you want to be in a place that brings together scientists, engineers, researchers, educators, programmers, system administrators, and managers to discuss, discover, and innovate the path forward for computing? If so, the SC08 Broader Engagement initiative might just be for you.
  • The non-autonomy of the virtual: philosophical reflections on contemporary virtuality

    Much contemporary talk of virtual 'worlds' proceeds as if the virtual could somehow be considered as in competition with or as an alternative to the world of the 'nonvirtual' or the 'everyday'. This paper argues that such a contrast is fundamentally mistaken, and that the virtual is not autonomous with respect to the everyday, but is rather embedded within it, and an extension of it.

  • Preface by Arun Tripathi to Jeff Malpas' 'The Non-Autonomy of the Virtual'
    Australian philosopher Jeff Malpas, author of Place and Experience, argues in his Ubiquity paper The non-autonomy of the virtual: philosophical reflections on contemporary virtuality that the virtual is not autonomous with respect to the everyday, but is rather embedded within it, and an extension of it. Within philosophy, Professor Malpas is perhaps best known as one of a small number of philosophers who work across the analytic-continental divide, publishing one of the first books that drew attention to convergences in the thinking of the key twentieth century American philosopher Donald Davidson and the phenomenological and hermeneutic traditions, as exemplified in the work of Heidegger and Gadamer.
  • Employee retention: By way of management control systems

    Loyalty is passé in modern times, and professionalism is the buzzword in the contemporary corporate world. The reasons for employee attrition are also changing. Nowadays employees leave an organization for many reasons: some leave for growth, some for family problems, but the majority switch jobs for one reason only, and that is dissatisfaction. Undoubtedly, satisfaction and dissatisfaction mean different things to different people, but keeping the majority of people satisfied is germane to sustainable growth and a high level of productivity in any organization. A thorough analysis of attrition brings up some major concerns, such as a lack of objectivity in job allocation, in employee recognition, and in fairness of career advancement. These are important causes of employee dissatisfaction in organizations, and this dissatisfaction finally finds a vent in the form of changing jobs. Subjectivity in the handling of employees is the key source of dissatisfaction. A management control system is fully capable of bringing objectivity to the organization and managing this dissatisfaction, which would finally translate into higher employee retention, better productivity, and better organizations.

  • Mental Models: Aligning design strategy with human behavior
    Indi got her BS in Computer Science from Cal Poly and began her master's at Colorado State. She then worked as a software engineer, later managing Web applications that focused on the user. Her concepts in mental models derive from attempting to bridge the developer-user gap. Her expertise ranges from structuring cross-functional teams, to managing participant recruiting, and conducting user interviews, thereby creating effective tools for exchanging results.
  • How to use presentation slides to best effect
    How often have you attended a presentation where great attention apparently went into designing the slides - and apparently none into how they were used? Or the speaker played with the slides as if to entertain rather than edify?
  • Smart phones: A tutorial
    1. INTRODUCTION: Pervasive computing integrates computation into the environment, rather than in computers that are distinct objects. Other terms for pervasive computing are ubiquitous computing, calm technology, things that think, and everyware. In other words, pervasive computing means computers everywhere, making them available throughout the physical environment while making them effectively invisible to the user. Rather than having a desktop or a laptop machine, with pervasive computing the technology is embedded in the environment. Ubiquitous technology is often wireless and networked, making its users more connected to the world around them and the people in it. Through pervasive computing, users use today's digital tools, such as laptops, phones, PDAs, and smartphones, to communicate and exchange information in different ways, and conceive and use geographical and temporal spaces differently. In being an aspect of information dissemination, pervasive or ubiquitous computing is global and local, social and personal, invisible and visible at the same time.
  • The rise and fall of a good programmer
  • Time to get serious about the paperless office
    Of all the sayings I dislike, the most vapid is one I have heard as long as I have been working with IT: We will have the paperless toilet before we have a paperless office. Normally uttered with a dry cackle and a finger pointed towards my office, which does not lack for paper.
  • An Interview with Vaughan Merlyn on Management
    Vaughan Merlyn, a management consultant, researcher, and author, has for more than three decades focused primarily on the use of information and information technology for business value creation. He was interviewed about software consulting and management.
  • Avoiding disaster when your hard drive fails
    No one really expects a disk crash, but they do happen, usually at the most inconvenient times. Having a quick and easy-to-restore backup can eliminate both the distress and expense of the prolonged downtime normally associated with a hard drive failure. When restoring from a drive failure, the best kind of backup to have is an image backup.
  • Anticipating and resolving resource overloads
    The Concept of a Project Resource: In the context of project management, a resource is any entity that contributes to the accomplishment of project activities. Most project resources perform work and include such entities as personnel, equipment, and contractors. However, the concept of a resource (and the techniques of resource management presented in this paper) can also be applied to entities that do not perform work, but which must be available in order for work to be performed. Examples include materials, cash, and workspace. This paper focuses on the resource that is of greatest concern to most organizations: personnel. In a project management system, personnel resources may be identified as individuals by name or as functional groups, such as computer programmers.
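    The paper's techniques go well beyond this, but the bookkeeping behind anticipating an overload can be sketched in a few lines: sum each resource's assignments per period and compare the total against capacity. In the Python sketch below, the data, the names, and the 40-hour weekly capacity are our assumptions, not the paper's.

        # Illustrative sketch: flag periods where a resource's assigned
        # hours exceed its capacity. All data here is hypothetical.
        from collections import defaultdict

        capacity = {"programmer": 40, "tester": 40}   # hours per week
        assignments = [                               # (resource, week, hours)
            ("programmer", 1, 30), ("programmer", 1, 20),
            ("programmer", 2, 35), ("tester", 1, 25),
        ]

        load = defaultdict(int)
        for resource, week, hours in assignments:
            load[(resource, week)] += hours           # aggregate per period

        for (resource, week), hours in sorted(load.items()):
            if hours > capacity[resource]:
                print(f"overload: {resource}, week {week}: "
                      f"{hours}h assigned vs {capacity[resource]}h available")

    Resolving the overload (by leveling, reassignment, or rescheduling) is where the paper's techniques come in.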
  • Why visual aids need to be less visual
    I was recently invited to a presentation by an accomplished speaker. Needless to say, his speech was well structured, his manner relaxed and confident, his eye contact and body language excellent, etc. He normally spoke without slides, but this time he felt they would reinforce and illuminate his message. They didn't. In fact, they were more of a hindrance than a help.
  • Interview with MIT's Robert Langer
    Dr. Robert Langer's work is at the interface of biotechnology and materials science. A major focus is the study and development of polymers to deliver drugs, particularly genetically engineered proteins, DNA, and RNAi, continuously at controlled rates for prolonged periods of time.
  • Outsourcing-offshoring: how will you derive the value?
    The equations of global economies are changing fast due to industry consolidation, mergers, and acquisitions, and on account of industries' hunger for global reach and the race to scale up higher and higher, thereby creating stiff competition among rivals. Business houses are deeply involved in looking for techno-commercially viable IT solutions to grow at faster speed, support the business effectively, and provide competitive advantage by using the latest emerging technologies.
  • Arrogance or efficiency? a discussion of the Microsoft office fluent user interface
    1. Introduction: I was writing an e-mail message the other day using Microsoft Office Outlook 2007 and clicked on the button for adding one of my signature blocks. Presto! Most of my message disappeared! Investigation and testing showed that the behavior was unpredictable; sometimes, only the existing default signature was replaced by the new signature, but occasionally the program became confused and wiped out portions of the text as well.
  • Is a worldwide common language just over the horizon?
    I am an American living in Belgium since 1974. Ever since arriving here, I have been hearing the mantra "To be a good European, you should learn several languages." Almost from the very beginning, I suggested going the other way: to be a good European, everyone should learn a single common language.
  • Why track actual costs and resource usage on projects?
    The importance of tracking actual costs and resource usage in projects depends upon the project situation. For some projects, tracking actuals is unnecessary or is not worth the effort required. In other cases, however, tracking actual costs and resource usage is an essential aspect of the project control function. In such cases, a system must be put into place to support the tracking process, and the collection/recording of the potentially voluminous quantity of data requires strong organizational discipline. Why then is tracking actual costs and resource usage on a project ever worth the effort required to accomplish it?
  • An Interview with Dr. Yi Pan of Georgia State University
    Ubiquity is proud to publish this inspirational interview, which starts with a discussion of the creation of the computer science department at Georgia State University and concludes with the heroic story of how an impoverished student from Tsinghua University in China overcame many obstacles to rise to a significant position at Georgia State. The interviewee is Yi Pan, Chair and Professor of Georgia State University's computer science department, who provided us with these inspirational reflections on computer science, academic success, and true success. The interview was conducted by Ubiquity editor-in-chief John Gehl.
  • An Interview with Michael Schrage on Ubiquity
    Author of several acclaimed books and numerous articles in such publications as Fortune and Technology Review, Michael Schrage is also a world-traveling consultant to all businesses great and small. He has been at MIT for many years, and his new academic home will be in that institution's Sloan Management School.
  • Hermeneutics facing the
    The origin of this paper goes back to the International Conference "Phenomenology and Technology" held at the Philosophy and Technology Studies Center, Polytechnic University (New York), October 2-4, 1986, which was organized by Wolfgang Schirmacher and Carl Mitcham. After thirteen years, obviously, things have changed and I have done some further work too. My book Hermeneutik der Fachinformation was published in 1986, and since then I have written several articles on this subject as well as another book, Leben im Informationszeitalter (Capurro 1995). Some of the articles, as well as a list of publications, can be found on my homepage (http://www.capurro.de). The present text is an enriched version of the original one. I have added some later insights without changing the basic ideas, which I still think are valuable and can also be of help when reflecting, for instance, on the nature of communicating and searching for information on the Internet.
  • ERP system replacement criteria

    An ERP system is our information backbone, reaching into all areas of our business and value chain. Replacing it can open unlimited business opportunities. The cornerstone of this effort is finding the right partner and specialist. Our long-term business strategy will form the basis of the criteria for selecting an ERP system replacement. Our ERP provider must be part of our vision, and it is the duty of a software provider to help us get there by doing their part to make sure our next system will be our last ERP system replacement. Below are some of the criteria that allow us to identify and select the solution that will meet these expectations.

  • Collective intelligence: include the disabled for success
    Are you looking for new ideas to leverage your IT to allow your workforce to collectively be more efficient at solving complex problems? Want to transform your corporation so that it can reach new heights?
  • Whatever happened to cybernetics?
    Has the discipline of cybernetics been unable to recognize and respond to appropriate "mid-course corrections" and in the process had its destiny imposed by external "turning points"? (A mid-course correction is an endogenously determined action, as in the classical "sense-process-act" sequence, whereas a turning point is simply a reaction to that which is exogenously imposed; i.e., "being overcome by events.")
  • Renovation of minimum spanning tree algorithms of weighted graph

    In this paper we describe and explain algorithms for generating the Minimum Spanning Tree (MST) of a weighted graph, with a renovated approach. We use a new cycle-testing algorithm to detect cycles, where required, during MST generation; the aim is to reduce the execution time spent on cycle testing. We also describe several MST algorithms for weighted graphs under this renovated approach, applying a new concept to explain minimum spanning trees with better time complexity.
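    For reference, the cycle test at the heart of MST generation is classically handled with a union-find (disjoint-set) structure, as in Kruskal's algorithm. The sketch below shows that standard approach, not the paper's renovated algorithm:

    ```python
    def kruskal_mst(n, edges):
        # n: number of vertices; edges: list of (weight, u, v) tuples.
        parent = list(range(n))

        def find(x):  # root of x's component, with path halving
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x

        mst = []
        for w, u, v in sorted(edges):   # consider edges in order of weight
            ru, rv = find(u), find(v)
            if ru != rv:                # edge joins two components: no cycle
                parent[ru] = rv         # union the components
                mst.append((u, v, w))
        return mst

    print(kruskal_mst(4, [(1, 0, 1), (2, 1, 2), (3, 0, 2), (4, 2, 3)]))
    # -> [(0, 1, 1), (1, 2, 2), (2, 3, 4)]
    ```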

  • Understanding software testing concepts
    Software testing concepts are briefly described in this article. Readers will find it easier to understand the fundamental concepts of software testing by going through the accompanying concept map. Software testing is both a discipline and a process. Software development is the process of coding functionality to meet the defined end-user requirements. We can think of software testing as an iterative process consisting of test design, test execution, problem identification, and problem fixing, carried out to validate functionality as well as to attempt to break the software. Software testing aims to find problems and to fix them in order to improve software quality. Software testing may represent 40 percent of a software development budget. The basic methods of performing software testing are manual testing and automated testing. Manual software testing is the process of manually exercising the software (navigating user interfaces, submitting information, or attempting to hack the software or database, for example), carried out by one or more individuals. Manual software testing is labor-intensive and slow. Automated software testing, on the other hand, is the process of creating test scripts that can then be run automatically, repetitively, and through a number of iterations. Automated software testing helps us minimize the variability of results, speed up the testing process, increase test coverage (that is, the number of different things tested), and ultimately provide greater confidence in the quality of the software being tested.
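    As a small illustration of the automated method, here is a self-contained test script using Python's standard unittest module (the function under test and its cases are invented for the example):

    ```python
    import unittest

    def normalize_email(address):
        # Function under test: trims whitespace and lowercases.
        return address.strip().lower()

    class TestNormalizeEmail(unittest.TestCase):
        def test_mixed_case(self):
            self.assertEqual(normalize_email("User@Example.COM"), "user@example.com")

        def test_surrounding_whitespace(self):
            self.assertEqual(normalize_email("  a@b.org "), "a@b.org")

    if __name__ == "__main__":
        unittest.main()  # runs every test, repeatably, on each invocation
    ```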
  • End laptop serfdom
    Time to end personal technology serfdom! I hate company-specific technology standards, at least those that specify technology in terms other than file formats, access protocols, and application programming interfaces. In most companies I am in touch with, employees get a laptop and a cell phone and are required to use a set of standard capabilities of some sort. More often than not these are unnecessarily complicated, old-fashioned, expensive, and singularly uninspiring. This is often for good reasons: the IT department wants to make things manageable for itself and for the organization, and employees need a standard frame of reference and a compatible set of tools for work. The helpdesk can figure out which keys to press, and the employees can see the same screens. Well and good, but users are beginning to rebel at the lack of options, especially the options they have on their own or previous computers.
  • About English: On the other hand
    I read Philip Yaffe's two recent Ubiquity pieces with interest, all the more so because I myself have plunged back into an international experience after sampling the delights of retirement for a year.
  • How many Americans does it take to change a light bulb?
    I changed a light bulb yesterday. Or, rather, I had the thing changed. I told an Egyptian friend I had a couple of bulbs burnt out and was assuming I could pick some up at one of the larger new supermarkets. He looked at me with barely concealed pity for my ignorance. No, he said. They don't carry light bulbs in a market. Later, in e-mail, he would spell it "light pulp," which is an image I find quite intriguing. Where's Einstein when you need him? Might light pulp be what glows inside the glass, I wonder?
  • Serial port data communication using MODBUS protocol
    Serial communication is the process of sending data sequentially, one bit at a time, over a communication channel or computer bus [5,6,7]. RS-232 is a standard for serial binary data transfer between data terminal equipment (DTE) and data circuit-terminating equipment (DCE), commonly used in computer serial ports.
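    For concreteness, a MODBUS RTU frame consists of a slave address, a function code, data, and a CRC-16 checksum sent low byte first. The sketch below computes the standard CRC-16/MODBUS and assembles a read-holding-registers request; the commented pyserial lines show how such a frame might be sent over an RS-232 port (the port name, timing, and register addresses are assumptions for the example):

    ```python
    def crc16_modbus(frame: bytes) -> bytes:
        # CRC-16/MODBUS: polynomial 0xA001 (reflected), initial value 0xFFFF.
        crc = 0xFFFF
        for byte in frame:
            crc ^= byte
            for _ in range(8):
                crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
        return crc.to_bytes(2, "little")  # low byte transmitted first

    # Request: slave 1, function 0x03 (read holding registers),
    # starting address 0, quantity 2.
    pdu = bytes([0x01, 0x03, 0x00, 0x00, 0x00, 0x02])
    frame = pdu + crc16_modbus(pdu)
    print(frame.hex(" "))

    # With pyserial installed, the frame could be sent over a serial port:
    # import serial
    # port = serial.Serial("COM1", 9600, bytesize=8,
    #                      parity=serial.PARITY_NONE, stopbits=1, timeout=1)
    # port.write(frame)
    # reply = port.read(9)  # addr + func + count + 4 data bytes + 2 CRC bytes
    ```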
  • Is the GMO controversy relevant to computer ethics?
    Computing and information technology professionals have exhibited high standards of engagement with ethical issues relating to privacy, information security and abuse of the technical capabilities they have been responsible for developing. But one can argue that computing capability is implicated in ethical controversies that receive relatively little discussion within the IT community. Stem cell research, nanotechnologies and other controversial areas of science would be impossible without the computational capacity of information processing. In many instances, the downstream applications of computer technology are deeply involved in the issues surrounding contested technologies.
  • An approach for conducting enterprise resource planning assessment

    The failure to plan will lead to results that fall short of expectations. The same thing can be said of companies and their search for a new Enterprise Resource Planning (ERP) system. All companies undertake the search for a new system because they believe there is an opportunity to improve the organization, either by improving the revenue, decreasing costs or both.

    But far too often, companies undertake this effort without a plan for how to select a new ERP system. They fall victim to selecting the product with the best sales presentation or the best cost proposal, with no quantitative evidence that the new system will actually achieve the goals of improving revenue or decreasing cost.

    So put an evaluation plan in place. No two plans will be the same but all should have the same basic concepts.

  • Unleashing Web 2.0: From concepts to creativity
    Vossen is a professor of both IS and CS at the University of Muenster, and has served as European editor-in-chief of Elsevier's international information systems journal. Hagemann is his PhD student, whose area of research is Web technology.
  • Techniques of persuasive communication: old wisdom in a new package
    What you are about to read will probably sound familiar. Indeed, it has been said many times before. However, I believe this formulation is original and may help you better apply it in your marketing communication. I immodestly call it Yaffe's Law.
  • Technology transfer and modernization: what can philosophers of technology contribute?

    Technique or technology transfer is based in many ways on technological and economic paths that were often created by European colonization and have been intensified by industrialization and globalization. On the one hand, the modern age is a constantly developing planetary reality, one that impacts every society in the world. On the other hand, societies in third-world countries have not produced this condition themselves, because modernity is an external imposition. This means the modern age turns out to be an unavoidable destiny for them. Traditional modernization and technology transfer abstract away almost all contextual factors. That is why technological development and modernization are compared across continents and judged to be of more or less value without considering cultural and social circumstances. This presumes, on the one hand, that the Western way into the modern age has a model character, that it is normative, and that there are no alternatives to it; it also presumes that the modern age is a desirable objective and that compensation leads to equal final situations.

    A requirement for technological standards and for technology transfer is innovation that constantly promises new development paths, together with stable institutional settings that can be monitored over a long period. These settings must be secured by the cultural system, especially its socio-economic dimensions. The religious dimension is also part of this setting, and in Africa and Southeast Asia it is closely connected to the form and style of life and to culture. Secularization comparable to that of the Western world takes place only in major cities, which are islands of modernization. For thousands of years, technological process innovations, such as those carried by technology transfer, have been absorbed through cultural embedding rather than through modernization. This varies from place to place but in general shows the same pattern: heteronomous transfers meet culturally motivated resistance or are ignored. The circumstances of technology transfer are somewhat different: it is not identified with culture transfer and does not automatically lead to broad modernization, but rather to a form of development that proceeds at the speed of cultural adjustment, certainly slower than modernization requires. Yet this development can mostly be absorbed with the help of the embedding paradigm. Our task is to generate forms of modernization that respect cultural embedding and traditions.

  • A guide for project-based manufacturers and secrets for software buyers

    Most systems have their heritage in the Material Requirements Planning (MRP) philosophy developed in the 1960s. This concept utilized computer power to calculate time-phased material requirements. It later evolved into MRPII promoted by APICS and Ollie Wight during the 1980s, and further evolved to the Enterprise Resource Planning (ERP) systems available today.

    The original premise of all of these systems is that material planning is the center of the universe. The typical manufacturing system was designed with an MRP process at the heart of the system. The emphasis of such systems is on standard bills and routings and standard costs.

    Companies in the engineer-to-order (ETO) world have different requirements. Designing and building complex products to exact customer specifications frequently involves long lead times and heavy engineering content. To win business, you must provide accurate estimates and quotations to a demanding customer base. Unlike the majority of manufacturers, capital equipment manufacturers typically purchase material for a specific project or job. You need to do progress billing and collect actual costs against projects. Often, you will not receive payment for a project until it is installed and operating at a customer's site, so cash management is of vital importance. And after the sale, you need to track warranty information and provide aftermarket services, including the sale of spare parts that may constitute a significant share of your company's business.

  • Understanding dependable computing concepts

    This work aims to visually describe the important concepts of a dependable computing system and the relationships between those concepts. The concept map presented here should make this important emerging research topic in computer science and engineering easier and more meaningful to understand, sparing readers from working through long stretches of text, line after line, to conceptualize such a rich and interesting topic.

  • Economic recognition of innovation

    Globalization has benefited the economies of member countries of the Organization for Economic Cooperation and Development (OECD) by helping their businesses stay profitable through cost-effective outsourcing of mostly garden-variety tasks and some knowledge-based activities. With time, the latter will account for the lion's share of work outsourced, and emerging export houses will also tend to cater more to their own domestic markets because of their expanding infrastructure and growing manpower possessing advanced skills. This will result in a leveled playing field, coaxing developed countries to adopt widespread innovations to maintain their high perch in the economic pecking order. Such large-scale creativity can be managed better if it can be gauged with an appropriate measure. This work propounds a new economic measure, called Gross Domestic Innovation (GDI), to quantify innovation in OECD countries. It will supplement universal measures such as the Gross Domestic Product (GDP), productivity, and employment figures. Apart from the methodology for its estimation, the impact of GDI on the various facets of a vibrant economy is discussed and, inter alia, the role of GDI in fighting inflation and alleviating the negative influences of globalization is stressed. A tentative analysis of the economies of the U.S., Japan, Germany, and China is also presented to illustrate the concept.

  • Ubiquity interview with Neumont's Graham Doxey
    Neumont University in Salt Lake City was featured in Ubiquity two years ago, with an interview with one of its founders, Scott McKinley. We wanted to go back and see how they're doing at this new and unique institution, about which senior vice president Julie Blake has explained: "The industry has said for years that even our best universities aren't preparing students for the workplace. Neumont was founded to fill that niche." Below is a Ubiquity interview with Neumont cofounder and President Graham Doxey.
  • AI re-emerging as research in complex systems
    The history and the future of Artificial Intelligence can be summarized in three distinctive phases: embryonic, embedded, and embodied. We briefly describe early efforts in AI aiming to mimic intelligent behavior, which later evolved into a set of useful, embedded, and practical technologies. We project the possible future of embodied intelligent systems, able to model and understand the environment and learn from interactions while evolving in constantly changing circumstances. We conclude with the (heretical) thought that in the future, AI should re-emerge as research in complex systems. One particular embodiment of a complex system is the Intelligent Enterprise.
  • Reflections on the philosophy of technology: culture of technological reflection
    "Philosophers point out the liabilities, what happens when technology moves beyond lifting genuine burdens and starts freeing us from burdens that we should not want to be rid of." (Albert Borgmann)"The unintended consequences and dangers of technologization are real, and they deserve reflections and replies. Meanwhile the deeper danger of cultural and moral devastation goes unnoticed and is to some extent eclipsed by attention to the overt dangers (which, to repeat, need to be addressed forthwith)." (Albert Borgmann)
  • The waning importance of categorization
    The mobile phone has caused us to plan less and communicate more. The Internet causes us to categorize less and search more - and media's increasing Internet nervousness is driven not just by fear of diminishing revenues but from the fear of a loss of importance of categorization. When everybody can find everything and networked computers determine what is relevant, media companies lose their ability to create agendas. To maintain their influence, they will need to let the Internet shape their main products, not desperately try to keep the world as it is.
  • Electronic scanning in space of the planar array of four patch antennas
    The rectangular patch antenna is set to play a significant role in the development of next-generation wireless communication systems. The purpose of this report is to design a rectangular patch antenna system by studying the performance of a patch antenna array, and to achieve electronic scanning in space of the radiation patterns using a planar array of four rectangular patch antennas. The designed square patch array will have four elements, and its performance will be evaluated in terms of radiation patterns. Results given by the MATLAB and PCAAD (Personal Computer Aided Antenna Design) software will be tabulated, and antenna radiation patterns will be plotted for discussion before wrapping up with a conclusion and suggestions for future developments.
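    The report's pattern computations are carried out in MATLAB and PCAAD; purely as an illustration of the underlying idea, the NumPy sketch below evaluates the array factor of a uniformly excited 2x2 planar array with half-wavelength spacing and a progressive phase shift steering the beam (spacing, scan angle, and cut plane are assumptions for the example):

    ```python
    import numpy as np

    d = 0.5                # element spacing in wavelengths
    k = 2 * np.pi          # wavenumber, in units where wavelength = 1
    theta0, phi0 = np.radians(20), 0.0   # desired scan direction

    theta = np.linspace(-np.pi / 2, np.pi / 2, 361)  # x-z plane cut
    phi = 0.0

    af = np.zeros_like(theta, dtype=complex)
    for m in range(2):     # element at (m*d, n*d) in the array plane
        for n in range(2):
            steer = -k * d * (m * np.sin(theta0) * np.cos(phi0)
                              + n * np.sin(theta0) * np.sin(phi0))
            af += np.exp(1j * (k * d * (m * np.sin(theta) * np.cos(phi)
                                        + n * np.sin(theta) * np.sin(phi))
                               + steer))

    pattern_db = 20 * np.log10(np.abs(af) / np.abs(af).max())
    print(np.degrees(theta[np.argmax(pattern_db)]))  # beam peak near 20 degrees
    ```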
  • Corporate renewal engines
    The great, long-living companies are able to adapt to tectonic market shifts and historical changes while crossing different technological epochs. Several books have captured historical and anecdotal evidence about such extraordinary businesses, yet we have only a few crisp and simple business models of these great companies.
  • Low-cost testing for transient faults
    An unconventional, low-cost, software-implemented testing technique for processor transient faults is briefly discussed here. Online signatures of the Processor Status Register are used to detect transient faults.
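    The paper does not spell out its signature scheme, so the fragment below is only a generic illustration of signature-based detection: a stream of observed status-register values is folded into a compact signature and compared with a golden reference from a known-good run (all register values are invented):

    ```python
    def signature(samples):
        # Fold 16-bit status-register samples into a rotate-and-XOR signature.
        sig = 0
        for s in samples:
            sig = ((sig << 1) | (sig >> 15)) & 0xFFFF  # rotate left by 1
            sig ^= s & 0xFFFF
        return sig

    golden = signature([0x0200, 0x0244, 0x0200])  # reference (fault-free) run
    online = signature([0x0200, 0x0246, 0x0200])  # run under test: one bit flipped
    print("transient fault suspected" if online != golden else "ok")
    ```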
  • An Interview with Scott McKinley: Project-Based Learning: The Neumont University story
    Neumont University co-founder and CEO Scott McKinley says the most innovative aspect of the Neumont curriculum is its focus on student projects: "Our freshmen are on project teams from the very beginning. Their first projects are simple, heavily scaffolded, and commensurate with their novice skills. By the time they enter their last three quarters, they're working on real industry projects for serious names that work with us, including IBM and Microsoft."
  • Artificial and Biological Intelligence
    Subhash Kak of Louisiana State University says that "humans will eventually create silicon machines with minds that will slowly spread all over the world, and the entire universe will eventually become a conscious machine."
  • Mailbag
    In his article 'Artificial and Biological Intelligence,' Subhash Kak of Louisiana State University asks if 'humans will eventually create silicon machines with minds that will slowly spread all over the world, and the entire universe will eventually become a conscious machine?' These are some comments on his paper.
  • An Interview with Alan Lenton: On Games
    Noted U.K. game designer Alan Lenton talks about his award-winning multi-player game Federation and discusses the sociology and psychology of gaming.
  • INDUS: A New Platform for Ubiquitous Computing
    Kallol Borah began development of the Indus project at the Indian Institute of Technology Madras in 2002. Indus demonstrates how general purpose object oriented programming languages can be extended to enable ubiquitous computing applications.
  • A Three-Dimensional Model for Evaluating Software Development Projects
    In this model created by Dr. K.V.K.K. Prasad, software development is viewed in two dimensions (despite the title), based on the answer to the questions: 1) Is it inspired by considerations of utility and value? 2) Does it advance software technology?
  • An Interview with William P. Dunk: On Collaboration
    Management consultant and futurist William P. Dunk says, "What collaboration is about is distributed intelligence, and I think that systems and governments and companies are all in such a degree of gridlock now that we desperately need to have broad-based intelligence coming into play everywhere."
  • An Interview with John Markoff: What the dormouse said
    John Markoff is author of the new best-seller "What the Dormouse Said: How the 60s Counterculture Shaped the Personal Computer Industry," and is a senior writer for The New York Times. His other books include "Cyberpunk: Outlaws and Hackers on the Computer Frontier" and "Takedown: The Pursuit and Capture of Kevin Mitnick, America's Most Wanted Computer Outlaw."
  • IT job outsourcing
    Bhumika Ghimire, who is from Nepal, is a graduate of Schiller University, where he studied IT Management and where outsourcing was his special field of interest. Here, he asks, "How do we define outsourcing?"
  • An Interview with F-H Hsu: Chess, China, and Education
    Feng-Hsiung Hsu, whose book "Behind Deep Blue" told the story of how world chess champion Garry Kasparov was defeated by the IBM computer known as Deep Blue, is now a senior manager and researcher at Microsoft Research Asia.
  • An Interview with Leonard Kleinrock on nomadic computing
    Leonard Kleinrock developed the mathematical theory of packet switching, the technology underpinning the Internet, while a graduate student at MIT, a decade before the birth of the Internet, which occurred when his host computer at UCLA became the Internet's first node in September 1969. He is now at UCLA, where he is Professor of Computer Science. He has won numerous awards and prizes for his achievements.
  • Reflections on challenges to the goal of invisible computing
    "Technology becomes subordinate to values through economics, government, or the professions. Our biggest problem is learning to recognize that we do have options, albeit often limited ones. Our tendency is to just create more technology rather than ask why." (Carl Mitcham, as he articulates the thesis of Albert Borgmann on the relationship between contemporary technologies and human values)
  • You should use both sides of your brain, right?
    Author Dan Pink argues that "nowadays, the fault line between who gets ahead and who doesn't is going to be mastery of these abilities that are more characteristic of the right hemisphere — artistry, empathy, big picture thinking. Those are the sorts of abilities that I think are really going to matter the most, not only in our individual career success, but also in our personal satisfaction."
  • Building smarter: an interview with Jerry Laiserin
    Architect and industry analyst Jerry Laiserin is an advocate for "building smarter" - the application of information technology to transform the way the built environment is designed, constructed and operated. His technology strategy publication, the LaiserinLetter, can be found at.
  • Science and Engineering of Large-Scale Complex Systems
    The world's economy can be seen as an excellent playing field for multiple, multi-faceted scientific disciplines and scientists. But for various reasons and causes, they are either disregarded or sometimes even carefully avoided. Kemal Delic, a lab scientist with Hewlett-Packard's R&D operations and a senior enterprise architect, explains.
  • Joseph Konstan on Human-Computer Interaction: Recommender Systems, Collaboration and Social Good
    An interview with Joseph Konstan: Konstan is an associate professor of computer science at the University of Minnesota. His background includes a bachelor's degree from Harvard College and a PhD from the University of California-Berkeley. His principal interests are human-computer interaction, recommender systems, multimedia systems, information visualization, internet applications and interfaces.
  • PCs in the classroom & open book exams
    What are the motivations behind giving an open-book/open-notes exam? Does giving free access to all of the resources of the Internet conflict with these motivations?
  • Leonard and Swap on 'Deep Smarts'
    An interview with Dorothy Leonard and Walter Swap: The first issue that any organization has to face is the identification of the deep smarts. Dorothy Leonard and Walter Swap are co-authors of the new book 'Deep Smarts: How to Cultivate and Transfer Enduring Business Wisdom.' Leonard is a professor emerita at the Harvard Business School and Swap is a professor of psychology emeritus at Tufts, where he was also dean of the college.
  • A Concise Guide to the Major Internet Bodies
    The bodies responsible for the Internet's protocols and parameters can be said to steer the Internet in a significant sense. This document, by Alex Simonelis of Dawson College in Montreal, is a summary of those bodies and their most important characteristics.
  • Ken Sevcik on Performance Evaluation
    Ken Sevcik is Professor of Computer Science at the University of Toronto. He received his B.S. in 1966 from Stanford University and his PhD in 1971 from the University of Chicago. Sevcik joined the faculty at the University of Toronto in 1971, and was Chair of the Department from 1990 to 1992. He also served as Director of the Computer Systems Research Institute (CSRI). His research interests are in the use of analytic models for performance analysis of resource allocation, scheduling and file structures in computer systems, computer networks, and distributed data management systems.
  • Anita McGahan on Industry Evolution
    Anita M. McGahan is author of the new book 'How Industries Evolve: Principles for Achieving and Sustaining Superior Performance' (Harvard Business School Press). She is the Everett V. Lord Distinguished Faculty Scholar and Professor of Strategy & Policy at the Boston University School of Management, as well as a Senior Institute Associate at Harvard's Institute for Strategy and Competitiveness.
  • Ken Robinson on Telecom Policy
    Ken Robinson is a communications attorney in Washington, having worked at the Departments of Justice and Commerce, the FCC, and the Office of Telecommunications Policy during the Nixon Administration. He is editor of the weekly publication 'Telecommunications Policy Review.'
  • Czerwinski on Visualization
    Mary Czerwinski is Senior Researcher and Group Manager of the Visualization and Interaction Research Group at Microsoft Research.
  • Mihai Nadin on Anticipatory Systems
    What is the difference between a falling stone and a falling cat? Mihai Nadin, who directs the newly established Institute for Research in Anticipatory Systems at the University of Texas at Dallas, holds a Ph.D. degree in aesthetics from the University of Bucharest and a post-doctoral degree in philosophy, logic and theory of science from Ludwig Maximilian University in Munich, West Germany. He earned an M.S. degree in electronics and computer science from the Polytechnic Institute of Bucharest and an M.A. degree in philosophy from the University of Bucharest. He has authored 23 books, including "The Civilization of Illiteracy," "Mind: Anticipation and Chaos," and "Anticipation: The End is Where We Start From."
  • What makes users unhappy: SharePoint Team Services web server security
    Computer and Internet security is very important, but sometimes it is so confusing and frustrating that it makes users very unhappy, to the point where the system is so secure that it cannot be used by its most legitimate users, such as system administrators.
  • Review of Activity-Centered Design
    With new insights to a well-documented topic, this book offers an excellent incentive and useful tools for system designers to pursue activity-centered design.
  • Michael Schrage on Innovation
    Looking for the great clients who are the true innovators? Co-director of the MIT Media Lab's eMarkets Initiative, a senior advisor to MIT's Security Studies Program, and a consultant to MIT's Langer Labs on technology transfer issues, Michael Schrage conducts research on the economics of innovation. His particular focus is on the role of models, prototypes and simulations in managing interactive iterative design, an area in which he works with a number of companies.
  • Computing or Humanities?
    The application of computing to research problems in the humanities is not new...
  • Reflections on the Limits of Artificial Intelligence
    Nature is very simple and efficient in everything she makes, and is extremely obvious. We humans like to simulate in an extremely complicated manner what exists quite simply in nature, and what we succeed in simulating falls in the category of artificial intelligence. Artificial intelligence has limits of scope, but they fade away when compared with the performances of natural intelligence. In this study, we undertake to outline some limits of artificial intelligence compared to natural intelligence and some clear-cut differences that exist between the two.
  • Technology footnotes: international time line
    In the days of hot type, magazine content was set in film. This writer offered "intriguing" suggestions for making publications more appealing to international audiences.
  • Checking in with Ben Bederson
    By focusing on the user experience, the University of Maryland's Human-Computer Interaction Lab aims to improve lives through projects such as the International Children's Digital Library. Benjamin B. Bederson, interviewed here, is an Associate Professor of Computer Science and director of the Human-Computer Interaction Lab at the Institute for Advanced Computer Studies at the University of Maryland, College Park. His work is on information visualization, interaction strategies, and digital libraries.
  • Patterns for Success
    Scott D. Anthony speaks about using innovation theory to transform organizations and create the next wave of growth. Anthony is a partner at Innosight, a management, consulting and education company located in Watertown, Massachusetts, and is co-author with Clayton M. Christensen and Erik A. Roth of the new book, "Seeing What's Next: Using the Theories of Innovation to Predict Industry Change."
  • Interfaces for staying in the flow
    Psychologists have studied "optimal human experience" for many years, often called "being in the flow". Through years of study, the basic characteristics of flow have been identified. This paper reviews the literature, and interprets the characteristics of flow within the context of interface design with the goal of understanding what kinds of interfaces are most conducive to supporting users being in the flow. Several examples to demonstrate the connection to flow are given.
  • An Interview with Joichi Ito: The world wide blog
    Joichi Ito, founder of Neoteny and other Internet companies, finds that cyberspace is embracing its roots — collaboration, community, and personal communications — with bloggers leading the way.
  • S. Joy Mountford on interface design
    The ultimate technology world will be soft, flexible and addressable. But the issues will remain the same, according to interface designer S. Joy Mountford: What do people like and what do people want?
  • Protecting intellectual property rights through information policy
    In today's electronic world, an organization's intellectual property is sometimes its biggest asset. Much time and money can be saved, and frustration and litigation avoided if company policy dictates ownership and use of intellectual property.
  • Ann Kirschner on marketing and distribution of online learning
    Outside of business schools, the very word "marketing" makes most universities uncomfortable, as does the idea of students as customers. But the world of higher education is becoming increasingly competitive. Fathom, named for the double idea of comprehension and depth, was a milestone in the evolution of online learning and a prototype of where things are headed.
  • An Interview with Steven Weber: Why open source works
    Author Steven Weber looks beyond the hype on Open Source. More than a self-governing utopia, it's a practical, sustainable way of organizing and innovating. Its method may soon be applied successfully in other sectors. Plus, a "crazy" idea for Microsoft.
  • An Interview with Jesse Poore: Correct by design
    Jesse Poore suggests a revolution in programming - holding software developers to the same level of rigor of training and workmanship as other professionals, developing software that's correct by design, and constraining the release of software-intensive products until they are scientifically certified as fit for use.
  • Roger Brent and the alpha project
    The work of a multidisciplinary genomic research lab in Berkeley may yield big changes in drug therapy and medicine. Roger Brent is President and Research Director of the Molecular Sciences Institute, an independent nonprofit research laboratory in Berkeley, CA, that combines genomic experimentation with computer modeling. The mission of the MSI is to predict the behavior of cells and organisms in response to defined genetic and environmental changes.
  • Calm technologies in a multimedia world
    In an ideal world, computers will blend into the landscape, will inform but not overburden you with information, and make you aware of them only when you need them.
  • Technology benefiting humanity
    Memo to the new generation of tech philanthropists: Apply the same intellect and discipline to your philanthropy as you employ in business.
  • Esther Dyson ... In focus
    Venture capitalist Esther Dyson is the chairman of EDventure Holdings, which publishes the influential monthly computer-industry newsletter Release 1.0 as well as the blog Release 4.0. The company also organizes the high-profile technology conference PC (Platforms for Communications) Forum, March 21-23, 2004. In this interview, she discusses her current interests, many to be covered at PC Forum. They include her investments, how to stop spam, outsourcing, and the overall high-tech industry environment.
  • An Interview with Peter Denning: The great principles of computing
    Peter Denning teaches students at the Naval Postgraduate School how to develop strategic, big-picture thinking about the field of computing. Denning, a past president of ACM (1980-82), has been involved with communicating our discipline, computing, to outsiders since 1970. Along the way he invented the working set model for memory management, developed the theory of virtual memory, promulgated operating systems theory, co-invented operational analysis of system performance, co-founded CSNET, and led the ACM Publications Board while it developed the Digital Library. He is an ACM Fellow and holds five major ACM awards. He just completed a five-year term as chair of the ACM Education Board.
  • An Interview with Thomas Kalil: Where politics, policy, technology and science converge
    From the White House to Berkeley, Thomas Kalil has worked on shaping the national agenda for science and technology research initiatives. Kalil, President Clinton's former science and technology advisor, now holds a similar post at the University of California, Berkeley, where he helps develop new research initiatives and increase UC Berkeley's role in shaping the national agenda.
  • Emotional design
    Beauty and brains, pleasure and usability go hand-in-hand in good design.
  • 2004, The turning point
    An overview of some of the issues that will change the way we use the Internet
  • An Interview with David Rejeski: Making policy in a Moore's Law world
    The accelerated rate of scientific discovery and technological innovation makes it difficult to keep up with the pace of change. What do policymakers know of nanotechnology and genetic modification? David Rejeski helps government agencies anticipate emerging technological issues.
  • Talking with security expert M. E. Kabay
    Adaptive attackers, novice computer users, indifferent management - it's no wonder our defensive mechanisms need continuous refinement.
  • Port wars
    In the not-too-distant-future, firewalls spark a battle over port regulation and ownership
  • The aeffability of knowledge management
    The challenge of knowledge management, and hence of online learning, is to make it work with the complexity and richness of actual human communication.
  • Talking with Ben Chi of NYSERNet
    How the Internet began in New York State, the current state of Internet2, and the remote possibility of Internet3
  • Is child internet access a questionable risk?
    The unlimited and pervasive use of the Internet by young people raises many concerns about child safety. What solutions are available and why aren't they being used?
  • A whole new worldview
    Anthropologist Christopher Kelty on programmers, networks and information technology
  • Building an inventive organization
    A creativity expert distinguishes the concept of creativity from that of innovation and discusses how to create a corporate culture that really fosters creativity
  • A designing life: Blade Kotelly
    A speech-recognition software expert explains the difference between good design and ambiguity, how good designs go bad, and why everyone is a designer.
  • The Virtues of Virtual
    Abbe Mowshowitz talks about virtual organization as a way of managing activities and describes the rise of virtual feudalism.
  • Commercial computational grids: a road map
    A consideration of the state of computational grids with respect to standards, current uses, and a road map for commercial benefit beyond their common applications
  • A model of democracy
    When can you have freedom, equality, moral reciprocity and a paycheck? Brook Manville on the surprising blueprint for organizational management.
  • Do you know what's in your project portfolio?
    Cathleen Benko and Warren McFarlan, authors of "Connecting the Dots: Aligning Projects with Objectives in Unpredictable Times" discuss the dangers of ignoring your IT portfolio.
  • Putting it all together with Robert Kahn
    The co-founder of the Internet recalls the non-commercial early days and looks at today's issues of fair use, privacy and the need for security.
  • Information access on the wide open web
    RLG's James Michalko discusses the issues surrounding the access and retrieval of scholarly information in today's environment of choice.
  • Learning by redoing
    The availability of components that do a myriad of tasks could lead to programmer complacency
  • Talking with John Stuckey
    A conversation with the Director of University Computing at Washington and Lee University
  • Teaching the history of computer science
    Students who are truly interested in computer science would enjoy learning about those programmers who went before them, and how they overcame their difficulties.
  • The rise of the intelligent enterprise
    Mother Nature knows best -- How engineered organizations of the future will resemble natural-born systems.
  • Robert Aiken on the future of learning
    In the hands of skilled teachers, technology will provide students with the best possible education -- both face-to-face and distant, collaborative and individualized, and entertaining and instructional.
  • Inside PARC
    Johan de Kleer talks about knowledge tracking, smart matter and other new developments in AI.
  • Digital promises
    The prospect of living our lives online may not be so attractive after all
  • Channeling innovation
    Despite its importance to business, innovation can be a confusing distraction. An effective process for managing innovation allows organizations to respond to markets while remaining focused on business objectives.
  • The future of internet security
    Should common security technologies be blended with biometrics for accuracy and reliability?
  • Beyond numbers
    Martha Amram on the current economics of technology investment.
  • The somatic engineer
    Engineers trained in value skills will be superior professionals and designers.
  • Stamp out technology virginity
    Technology virginity and technology virgins are everywhere -- and more influential than you might like. Time to go on the offensive.
  • The new computing
    Ben Shneiderman on how designers can help people succeed.
  • Nowhere to hide
    Companies will need to make themselves components of their customers' lives rather than trying to make customers a component of their organizations. To do this, they need to stop kidding themselves when it comes to electronic integration.
  • Intel's inside track
    Annabelle Gawer on the surprising sources of leadership in interdependent environments.
  • Mastering leadership
    Richard Strozzi-Heckler on moving to the next level.
  • Sold!
    Ajit Kambil on the inevitable, strategic use of electronic markets and auctions.
  • Quantum leaps in computing
    John P. Hayes on the next killer app, entangled states, and the end of Moore's Law.
  • The privacy paradox
    A national biometric database in place of our current flawed identification systems could prevent the loss of liberty and autonomy.
  • A conversation with Ruby Lee
    Innovative computer scientist Ruby Lee talks about secure information processing, efficient permutations, fair use in the digital age, and more.
  • Computer science meets economics
    Yale's Joan Feigenbaum talks about the possibilities for interdisciplinary research, the new field of algorithmic mechanism design, and her radical views on security.
  • Talking with Erol Gelenbe
    An international perspective on ubiquitous computing and university education.
  • Bringing resources to innovation
    A ten-year study follows the venture capital business from relative obscurity to boom to retrenchment
  • Freedom to think and speak
    Under Microsoft's Digital Rights Management operating system, the ability to use information freely will be policed at the most intricate level.
  • Complexity in the interface Age: An Interview with Jeremy J. Shapiro
    Do you control technology or does it control you? Jeremy J. Shapiro talks about the power struggle in machine/human relationships and what it means today to be information-technology literate. Shapiro is a faculty member in the Human and Organization Development Program at the Fielding Graduate Institute.
  • Optimizing bandwidth
    An approach to high performance distributed Web brokering.
  • What's in a name? Ask Yahoo!
    A company's brand is one of its most valuable assets, one that few high tech companies -- most recently HP and Compaq -- understand how to leverage, according to Sam Hill. Hill is co-author (with Chris Lederer) of the new book, The Infinite Asset: Managing Brands to Build New Value. He is the former chief marketing officer at Booz Allen & Hamilton and currently a partner at Helios Consulting Group and also co-author of Radical Marketing, now in its fourth printing.
  • What is software engineering?
    The name implies scientific rigor, and opens software engineering to the charge that it is a pseudo-science flying under false colors.
  • Think globally, act strategically
    John Parkinson relays the challenges for a global financial services firm including anticipating technologies, winning the war for talent, and finding innovative ways to maintain a corporate presence in a worldwide market.
  • Richard Leifer on Radical Innovation
    A group of six faculty members of Rensselaer Polytechnic Institute's Lally School of Management and Technology began work on something they called the Radical Innovation Research Project in 1994, focused on finding out how game-changing innovation occurs in established, mature organizations. The results of the project are reported in a new book from Harvard Business School Press entitled "Radical Innovation: How Mature Companies Can Outsmart Upstarts" and written by the six faculty members: Richard Leifer, Christopher M. McDermott, Gina Colarelli O'Connor, Lois S. Peters, Mark P. Rice, and Robert W. Veryzer. To find out more about the project, we talked with Richard Leifer, who has been at RPI since 1983 and whose academic specialties include organizational behavior, high-performance management, and leadership.
  • Tomorrow's news
    What will the news be like in the continuing evolving age of information? Ifra, an international association of newspaper organizations, has created a 10-minute video to provide a peek into the future of the newsroom while highlighting issues that present-day newsrooms must face in making the transition to the new publishing industry. Although early prototypes of the Daily Me have proven disappointing, "it cannot be ignored that the news industry, like most industries, is moving from a product-based business model to a service model under pressure of the Information Economy," writes Ifra's Kerry Northrup. "And good service requires some degree of personalization. The publisher of Tomorrow's News has profiled its readers, listeners, viewers and users sufficiently that it knows their collective interests and even individually where they work." Northrup, who plays a prominent role in Ifra's work as a worldwide leader in publishing strategies and technology, here provides Ubiquity a summary description of what lies ahead for the newspaper industry in the 21st Century.
  • Guide to the internet
    No matter where on earth, it isn't hard to find creative individuals who see the advantages that technology can confer.