Crashes and Traffic Jams in Military Test of Robotic
Vehicles
New York Times (11/05/07) P. A18; Markoff, John
The DARPA Urban Challenge was completed on Saturday, Nov. 3, with the
Carnegie Mellon team taking home the first place, $2 million prize.
Although the Carnegie Mellon car and a few others successfully completed
the race, the event highlighted how much farther autonomous vehicles need
to advance before becoming a practical solution. While the most
sophisticated entries were able to successfully navigate the simulated city
environment, others, including the Land Rover entry from the Massachusetts
Institute of Technology, were involved in several close calls and
collisions. The MIT car collided with a Passat built by researchers from
Braunschweig, Germany, an Oshkosh military vehicle came within inches of
crashing into a pillar, and the car from the University of Central Florida
was eliminated after crashing into an abandoned building. The DARPA Urban
Challenge was the third event in a series of races instituted following a
2004 directive from Congress that requires the military to replace a third
of its logistical vehicles with robots by the middle of the next decade.
DARPA project manager Norman Whitaker says the goal is to better protect
soldiers on the battlefield. During the race, the robotic cars were
required to perform a variety of tasks, including making left-hand turns
across oncoming traffic and pulling into and out of tight parking spaces.
Whitaker says the DARPA challenges are having a Sputnik-like impact on the
engineering and computer science departments at many universities, which
have seen increases in enrollment.
Voting Out E-Voting Machines
Time (11/03/07) Padgett, Tim
Although they were once considered the solution to outdated paper-based
voting systems, electronic and touch-screen machines have come under
intense scrutiny, and a new bill in Congress would ban DRE machines in
federal elections starting in 2012. "We have to start setting a goal on
this," says Sen. Bill Nelson (D-Fla.), who introduced the bill with Sen.
Sheldon Whitehouse (D-R.I.). "Voters have to feel confident that their
ballot will count as intended." Trust in electronic voting is eroding as
controversies over the accuracy of the machines are mounting. As a result
of a tainted election in 2006, Florida Republican Governor Charlie Crist
mandated the state return to an optical scan system in which votes are
marked on a sheet, which is kept for auditing, and electronically scanned.
Optical scanning is considered far more accurate, and is favored by the
National Institute of Standards and Technology, which advises the U.S.
Election Assistance Commission. The Nelson-Whitehouse legislation would
also require routine audits of at least 3 percent of the precincts in all
federal elections, and would possibly mandate that all voting machines
create a paper trail as early as the 2008 election. Some worry that many
states may not be able to incorporate paper trail technology by the 2008
election, but Dan McCrea, head of the Florida Voters Coalition, believes
that not only is it feasible but that it is also vitally important. "It
will be a challenge, but the voter fairness issue involved here is too
important," McCrea says.
DARPA Looks to Adaptive Battlefield Wireless Nets
Network World (11/01/07) Cox, John
The goal of Project WAND (Wireless Adaptive Network Development) is to
create a tactical radio network that connects soldiers to each other on the
battlefield through the use of low-cost, off-the-shelf radio components and
various software methods and algorithms, as part of an overarching Defense
Advanced Research Projects Agency effort to develop the Wireless Network
after Next infrastructure for military communications. WAND will use
adaptive spectrum management, which allows radios to seek out and tap
unused bands for communications through continual radio spectrum analysis
and dynamic spectrum access. Contractors working on Project WAND include
Tyco Electronics and BBN Technologies, and BBN's work in disruption
tolerant networks will be harnessed in another part of the WAND software
stack. The project aims to place as many as 10,000 nodes within a
relatively small area, and BBN scientist Jason Redi says, "Managing that
connectivity is a really difficult thing because links will be changing all
the time." An initial WAND technology demonstration is slated for January,
while a second is scheduled for next September. The project's software
advances form part of an expanding global research and development
initiative to produce "cognitive radios" capable of choosing the proper
radio waveforms, frequencies, and protocols to maximize efficiency,
reliability, and performance.
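The adaptive spectrum management described above can be sketched in miniature: sense every candidate band, then claim the quietest one that falls below an occupancy threshold. The band list, threshold, and `sense_energy` stub below are illustrative assumptions, not WAND's actual parameters or API.

```python
import random

# Hypothetical candidate bands (MHz) -- illustrative values only.
BANDS = [225.0, 410.5, 902.3, 2412.0]

def sense_energy(band_mhz):
    """Stand-in for a spectrum sweep: return measured energy (dBm) in a band."""
    return random.uniform(-110, -40)

def pick_band(bands, occupancy_threshold_dbm=-85.0):
    """Dynamic spectrum access in miniature: sense every candidate band
    and claim the quietest one below the occupancy threshold."""
    readings = {b: sense_energy(b) for b in bands}
    free = {b: e for b, e in readings.items() if e < occupancy_threshold_dbm}
    if not free:
        return None  # every band is busy; back off and rescan later
    return min(free, key=free.get)

band = pick_band(BANDS)
```

A real cognitive radio would repeat this sense-and-select loop continuously, which is why Redi notes that "links will be changing all the time."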
IT Salaries to Rise Twice as Fast as Inflation
CIO Insight (10/30/07) Perelman, Deborah
CIOs hiring skilled IT professionals will pay on average 5.3 percent more
in 2008 than in 2007, concludes a new Robert Half Technology report. Lead
developers who manage software development teams and projects will see the
biggest increases, with base compensation expected to rise 7.6 percent,
averaging between $80,250 and $108,000 annually. Application architects
will also see a significant increase, with an average 7.5 percent increase
and starting salaries ranging from $87,250 to $120,000. Web development,
network management, and database administration are also expected to see
salary increases of 7 percent or higher. "This was not really a surprise,"
says Robert Half Technology's Katherine Spencer Lee. "The strong increases
are still in the application development space, especially for individuals
that have those Web 2.0 skill sets. Those who can architect and develop Web
spaces had the highest increases that we saw, even 7.5 percent in some
titles." The study also found that nearly 15 percent of firms interviewed
said they plan on increasing their staff in 2008. Wireless communication
is one of the top areas driving IT job growth, largely because developers
continue to create more tools for mobile devices that IT departments need
to be able to support, which Lee calls the gadget factor. "With everyone's
devices communicating with everyone else's devices, there is a need for
people who are like the air traffic controllers of the IT department," Lee
says. The report says strong demand for IT professionals exists in the
financial services, health care, and commercial construction sectors.
The University's Role in Advancing Data Encryption, Part
2
TechNewsWorld (11/01/07) Burger, Andrew K.
Technological innovations, new legislation and regulations, and pressing
security needs are factors driving the increase of collegiate and
university encryption technology research, and areas that industrial and
academic investigators are considering as possible application centers
include nanotechnology, quantum cryptography, and the supply chain.
CipherOptics' Jim Doherty says network performance is an important area
that should not be ignored, even as most encryption research efforts are
focused on the creation of more robust encryption algorithms. "Today's
high-performance networks must be able to meet the latency requirements of
delay-sensitive applications such as voice and video over IP," he notes.
"While there may be a niche market for security over performance types of
solutions, broad adoption of new encryption algorithms will be determined
by speed as much as they are by security." Such is the nature of research
being conducted by the Rochester Institute of Technology's networking,
security, and systems administration department, whose researchers conclude
that the selection of a wide-area network encryption solution involves
consideration of not just performance, but also how well the technology
satisfies organizational requirements and other non-performance related
factors. Higher education institutions such as Southwestern Illinois
Community College are incorporating encryption into their courses and
curricula and using it to safeguard data on campus. The college's
Christine Leja says the possibility of requiring a data assurance course
with an encryption component is under discussion. She points
out that colleges and public and private sector organizations are also
being spurred to deter identity theft and find applications for encryption
technology by new legislation and the introduction of payment card security
standards. "Higher education provides open and secure access for its
students, and encryption offers a clear path to secure sensitive data and
support an open, mobile environment," she says.
Digital Eyes in the Sky Play Key Role in Battling Flames
in Southern California
National Science Foundation (10/30/07)
The High Performance Wireless Research and Education Network (HPWREN) has
enabled fire crews and residents in the San Diego area to obtain real-time
video and still images of fires that have beset the region. The
NSF-supported local network has a number of remote cameras perched in the
mountains and bluffs overlooking the region. "The HPWREN real-time cameras
tell us what is happening before engines or chiefs can get there," says Tom
Gardner, emergency command center chief of the California Department of
Forestry and Fire Protection (CAL FIRE). Residents of rural communities
such as Jamul have also set up blogs that encourage people to check out
HPWREN for video, images, and other information that traditional media may
not offer. "I've heard from many Jamulians about the cameras, all with
basically the same message: the cameras were what kept them sane," says
local resident Tom Dilatus. HPWREN is a non-commercial prototype of a
high-performance, wide-area wireless network covering San Diego and
Riverside counties. The network is used to provide wireless networking
technologies in emergencies, high-speed Internet access to field
researchers and network analysis research, and to enhance educational
opportunities for Native Americans in rural areas.
Argonne Plans Double Dose of Computing
Chicago Tribune (11/01/07) Van, Jon
Argonne National Laboratory recently announced a deal with IBM to acquire
two new supercomputers, which will be linked together to act as one.
Argonne will get the 445 teraflop Blue Gene/P system, which will be
combined with a slower Blue Gene/P system currently being installed. When
the two systems are linked they will operate at 556 teraflops.
Additionally, Argonne will continue to operate an older Blue Gene/L system
that runs at 5.7 teraflops. Argonne computer scientists will also provide
feedback to help IBM design future computers. Computer time at Argonne is
primarily focused on academic research, but also includes time for
industrial scientists and efforts to develop insights into fundamental
processes such as the formation of soap bubbles or the combustion of jet
fuel to make lower emission jet engines. Though the specifics of the deal
have not been disclosed, part of the arrangement includes a collaboration
to develop more open-source software for Blue Gene machines to help expand
the applications available. "Programmers have usually been taught to write
for a single computer or a few," says IBM chief technology officer of high
performance computing and software Dennis Quan. "They're not taught to
write for tens of thousands of machines. But levels of parallelism and
complexity are advancing to where in a few short years, this will be very
mainstream."
MIT Works Toward 'Smart' Optical Microchips
MIT News (11/01/07) Trafton, Anne
"Smart" optical microchips that adapt to different wavelengths of light
could emerge from a new theory developed by MIT postdocs Peter Rakich and
Milos Popovic, and telecommunications could be substantially advanced
through photonically powered micro-machines. An MIT research team
demonstrated earlier this year that photonic circuitry could be combined on
a silicon chip by polarizing all of the light to the same orientation, and
the current research shows how tiny mobile machines can be constructed on
such chips, exploiting the pressures exerted by photons as they bombard the
walls of a cavity. In combination with ultrapure laser light, this photon
bombardment causes radiation pressure to accumulate, and the researchers
suggest that machines built from extremely small ring-shaped cavities on
the surface of the chip can be driven by this pressure. A unique way of
processing data routed through fiber-optic networks is one potential
application of this concept. Existing fiber-optic resonators must be
synchronized with the incident light to ring at its frequency, while the
MIT theory could lead to a "smart" resonator capable of chasing the
frequency of the laser light incident upon it. "Our objective now is to
develop a variety of light-powered micro- and nanomachines with unique
capabilities enabled by this technology," Popovic says. "But the first
step will be to demonstrate the concept in practice."
Technology Tunes Into Our Emotions
ABC Science Online (Australia) (10/31/07) Cooper, Dani
An Australian computer scientist is developing a computer system that
would be capable of recognizing anxiety in people by analyzing their speech
and facial expressions. The technology uses speech recognition software to
monitor speech rhythm, pitch, and any quavering in the voice; artificial
neural networks are used to track facial expressions. Gordon McIntyre, a
Ph.D. student at Australian National University, is mapping 65 points on
the face that change due to the emotional state of a person, such as the
eyebrows, lips, and nose, which would enable him to compare an average or
expression-free face to an anxious face. "We build up an average shape of
a face from a database," McIntyre says. "And then measure the difference
between an average face and one that is subject to the emotion." McIntyre
does not have any anxious face samples, so he is working with psychology
colleagues to create a template image of the emotion, which would be added
to the database. The technology could serve as a tool for those who care
for the elderly, and could also be used in driver-safety applications.
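McIntyre's measurement, "the difference between an average face and one that is subject to the emotion," amounts to comparing two sets of landmark points. A minimal sketch, using a toy three-point shape rather than the 65 landmarks he maps, and ignoring the pose and scale normalization a real system would need:

```python
import math

def shape_distance(face, mean_face):
    """Sum of Euclidean distances between corresponding landmark points.
    Both arguments are lists of (x, y) tuples of equal length."""
    assert len(face) == len(mean_face)
    return sum(math.dist(p, q) for p, q in zip(face, mean_face))

# Toy 3-point example; McIntyre's scheme uses 65 points around the
# eyebrows, lips, and nose.
mean = [(0.0, 0.0), (1.0, 0.0), (0.5, 1.0)]
anxious = [(0.0, 0.1), (1.0, 0.1), (0.5, 1.2)]  # e.g., raised brows
score = shape_distance(anxious, mean)  # 0.1 + 0.1 + 0.2 = 0.4
```

A larger score indicates a face further from the database's average shape; thresholding or classifying that deviation is where the neural networks come in.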
UMass Researchers Describe New Approach to Tag
Security
RFID Journal (11/01/07) O'Connor, Mary Catherine
University of Massachusetts Amherst researchers have discovered a way of
securing RFID tags against tag cloning and fraud. Passive RFID tags
contain volatile memory composed of memory cells, each a circuit
representing a single piece of data. When an RFID scanner powers up the
tag, the chip's
memory cells produce a unique fluctuating electrical pattern before
creating the ID and any other information stored on the chip. The
fluctuating electrical pattern is unique to each RFID tag and can be used
to authenticate the tag the next time it is scanned. The pattern can also
be used to encrypt the tag's encoded data, securing it against being read
by an unauthorized scanner. An end user could apply a tag's unique pattern
to a randomness extractor as part of a hash cryptography process, creating
a string of random numbers that could be used to generate keys to decrypt
the stored tag data. Reading the data could be done with any scanner, but
changing the data would need to be done with specialized software to
generate the keys needed to decrypt the data. Only RFID tags that use
volatile static random access memory generate the pattern, meaning EPC Gen
2 tags, widely used in supply chain applications for tracking and tracing
products, cannot use the same security system because they use nonvolatile,
electrically erasable, programmable read-only memory (EEPROM) chips, which
are less expensive than volatile memory.
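The pipeline the article describes, a noisy per-chip power-up pattern stabilized and fed through a randomness extractor to derive a key, can be sketched as follows. Everything here is a simulation: the cell biases, noise rate, majority-vote stabilizer, and the use of SHA-256 as a stand-in extractor are assumptions for illustration, not the UMass researchers' actual construction.

```python
import hashlib
import random

def power_up(master, noise=0.05):
    """Simulate one SRAM power-up: each cell mostly repeats its fixed
    bias, flipping with small probability `noise`."""
    return [b ^ (random.random() < noise) for b in master]

def stable_pattern(master, reads=9):
    """Majority-vote several power-ups to suppress per-read noise --
    a crude stand-in for a real fuzzy/randomness extractor."""
    votes = [power_up(master) for _ in range(reads)]
    return bytes(
        int(sum(v[i] for v in votes) * 2 > len(votes))
        for i in range(len(master))
    )

def derive_key(pattern):
    """Hash the stabilized pattern into a fixed-length key."""
    return hashlib.sha256(pattern).digest()

random.seed(1)
chip = [random.random() < 0.5 for _ in range(256)]  # the tag's fixed cell biases
key_a = derive_key(stable_pattern(chip))
key_b = derive_key(stable_pattern(chip))  # a later scan of the same tag
```

With low noise, the two derived keys should usually agree for the same chip while differing across chips, which is what lets the pattern authenticate a tag and encrypt its stored data.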
Australian IT Skills Shortage Here to Stay
Computerworld Australia (10/29/07) Hendry, Andrew
Australia is currently in the middle of the worst IT shortage in its
history, and the situation is only going to worsen unless the industry and
the government act quickly, warn industry experts. The shortage is caused
by several factors, including the strong economy and low unemployment,
global competition for talent, an increasing dependency on technology in
all businesses, increased spending on technology, and the overall image of
the IT profession. Australian Information Industry Association CEO Sheryle
Moon says Australian companies also have to compete with the United States
and Europe, which offer IT professionals a higher wage, which makes it
difficult to attract skilled foreign IT professionals. Meanwhile,
Australia is experiencing declining rates in IT education enrollment.
Phillip Tusing of IT recruitment group Greythorn believes that public
opinion of the IT industry may already be changing for the better, though
improvement is still needed. "There is a lot more understanding of IT
professionals and what they do," Tusing says. "At the same time,
representation of females in the IT workforce is still too low." A
shortage in networking skills is particularly concerning. Companies such
as Cisco are worried that the growth of Web 2.0 and unified communications
will add to the skills shortage. A Cisco-commissioned study found that
Australia is short 6,000 skilled workers in the networking industry alone.
"We are seeing a lot of business driven by Web-related technology, and of
course, at the heart of the Internet lies 'networking' both as a business
model and technology," Tusing says. "The demand for networking skills will
continue as Internet-based communications reach critical mass."
Tabulator Redux: Writing Into the Semantic Web
University of Southampton (ECS) (11/02/07) Berners-Lee, Tim; Hollenbach,
J.; Lu, Kanghao
The Semantic Web has a dual-level architecture that features a "web" of
directed, untyped links between documents and a "graph" of directed, typed
links between things described in the documents, and the goal of the
Tabulator project is to make users of the interface capable of effective
engagement with co-workers by investigating, analyzing, and collaboratively
co-authoring the shared graph of knowledge. The authors concentrate on
delivering the write side of the readable/writable Web in the latest
iteration of the Tabulator project. They permit the natural modification
and insertion of information within the browsing interface, and communicate
revisions to the server triple by triple for least possible brittleness.
The outline mode of the Tabulator supports three categories of
editing--object modification, addition of a new object with an existing
predicate, and addition of a new predicate/object pair for an existing
subject. Among the remaining challenges is the propagation of changes by
collaborators back to the interface to generate a shared editing system.
The collaborative aspects of the system must also be augmented, as well as
user interface usability. To support writing across Semantic Web
resources, several technologies have been contributed, among them an
HTTP/SPARQL-Update-based protocol between an editor and incrementally
editable resources stored in an open source "data wiki."
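Communicating revisions "triple by triple" maps naturally onto SPARQL 1.1 Update, where one edit becomes a small DELETE DATA/INSERT DATA payload. The helper below sketches that idea; the function names and example URIs are illustrative, not Tabulator's actual code.

```python
def triple(s, p, o):
    """Serialize one subject/predicate/object triple; the object is passed
    pre-formatted (quoted literal or <uri>)."""
    return f"<{s}> <{p}> {o} ."

def patch_payload(delete=None, insert=None):
    """Compose a minimal SPARQL 1.1 Update body for one triple-level edit."""
    parts = []
    if delete:
        parts.append("DELETE DATA { %s }" % triple(*delete))
    if insert:
        parts.append("INSERT DATA { %s }" % triple(*insert))
    return ";\n".join(parts)

# "Object modification": replace one value on an existing subject/predicate.
body = patch_payload(
    delete=("http://example.org/card#me",
            "http://xmlns.com/foaf/0.1/name", '"Tim"'),
    insert=("http://example.org/card#me",
            "http://xmlns.com/foaf/0.1/name", '"Timothy"'),
)
```

Sending each edit as its own small payload is what gives the protocol the "least possible brittleness" the authors aim for: a failed update affects one triple, not a whole document.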
Trashed Silicon Wafers Find a Place in the Sun
TechNewsWorld (10/30/07) Aun, Fred J.
The Semiconductor Industry Association reports that 3.3 percent of the
approximately 250,000 silicon wafers produced each day worldwide are
discarded. However, IBM has developed a silicon reclamation process to
salvage the discarded wafers and reuse them. The process that removes
circuitry from the wafers also offers a complete method of erasing their
intellectual property. IBM plans to initially use the wafers for equipment
calibration and testing, but eventually they will be sold to the solar
panel industry. IBM says solar panel makers could see energy reductions of
as much as 90 percent by using reclaimed silicon. Meanwhile, recycling
silicon could provide a boost to the solar energy industry, since a severe
shortage of silicon threatens to stall the industry's growth, says Charles
Bai of ReneSola, a Chinese solar energy company. "This is why we have
turned to reclaimed silicon materials sourced primarily from the
semiconductor industry to supply the raw material our company needs to
manufacture solar panels," Bai says.
Agile Process Showcased at IT Architect Event
Campus Technology (10/25/07) Mackie, Kurt
At the International Association of Software Architects' first annual IT
Architect Regional Conference for Southern California, IASA Fellow and IBM
Rational practice leader for agile development Scott Ambler delivered a
keynote address on the agile software development process, which he
anticipates will become the standard by the end of the decade. He
mentioned a March 2007 agile development survey by Dr. Dobb's Journal that
found that 69 percent of respondents said their organizations were
undertaking at least one agile project, and he derided the practice of
following repeatable processes in software development as foolish. Ambler
also cited Standish Group data that 45 percent of software functionality is
not employed on successful software development initiatives as an argument
for going agile, and noted that 80 percent of business stakeholders do not
desire the software that is written to spec. An August Dr. Dobb's survey
identified key elements of successful software development projects as
determined by business stakeholders, such as software delivered when ready
rather than on schedule; software produced under budget and with return on
investment; software quality that fulfills requirements irrespective of
time and budgetary limits; and software that meets business stakeholders'
needs. Ambler pointed to the low probability that business stakeholders
and project managers will reach a consensus on a software project, and
noted that among the model strategies agile development involves are a
holistic perspective, active business stakeholder participation in creating
applications, and the combination of just-in-time modeling with testing.
Although governance is essential to agile development success, Ambler
cautioned that developers must be given the proper motivation, and a
well-entrenched IT governance process is especially critical as the number
of IT projects mounts.
Math on Fire
Science News (11/03/07) Vol. 172, No. 18, Rehmeyer, Julie J.
The crucial role climate plays in the spread of wildfires is the basis of
a new predictive supercomputer model of fire behavior being developed by a
team of mathematicians and scientists that blends fire and weather
patterns. The researchers are working on ways to update the model with the
most recent fire and weather data every 30 minutes in order to improve the
model's accuracy, and this requires new techniques for rapid, on-the-spot
data collection. The National Science Foundation has earmarked $2 million
for the project. The scientists are constructing software capable of
automatically processing data from airborne thermal and infrared sensors,
while another area of research is minuscule, autonomous fire detectors
equipped with radio transmitters, GPS, and sensors for measuring smoke,
temperature, and other factors that can be deployed into a wildfire area by
aircraft or firefighters. These various devices would communicate their
data remotely to a supercomputer, which would match it against the latest
weather information to predict the likely direction and speed of the fire's
proliferation. This data would then be sent to handheld computers carried
by firefighters in the field. The initial version of the model is
complete, and is now being tested on historical fire data.
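The 30-minute update cycle the researchers describe is a data-assimilation loop: each round, the model's prediction is nudged toward the latest sensor observations. The toy step below illustrates the idea with a single scalar state and an arbitrary gain; real coupled fire-weather models assimilate far richer data.

```python
def assimilate(predicted, observed, gain=0.5):
    """Blend prediction and observation; gain=0 trusts the model,
    gain=1 trusts the sensors."""
    return predicted + gain * (observed - predicted)

state = 10.0  # predicted fire-front position (km along some transect)
for obs in [10.8, 11.9, 13.1]:  # sensor fixes arriving every 30 minutes
    state = assimilate(state, obs)
# state is now pulled partway toward each observation in turn
```

Between assimilation steps, the model would run forward using the latest weather fields to forecast the fire's direction and speed; the observations then correct the drift that accumulates in those forecasts.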
Reading the Future
University of Victoria (11/01/07) Cador, Jennifer
University of Victoria English professor Ray Siemens believes that within
one generation the majority of reading will be done online, including
full-length books. Siemens is leading a multidisciplinary group of
researchers dedicated to developing a new way of reading, essentially
striving towards the electronic book. Siemens says the electronic book
would simply be an extension of existing trends because so much reading
already takes place online. Online reading has not yet included books,
however, largely because of the computer's lack of portability and because
reading from a computer screen is harder on the eyes than reading from a
book. While laptops are highly portable, they are not convenient for
reading on a bus or at the beach. Additionally, many people like the
experience of reading a book. Siemens and his research group are
considering all of these factors as part of their effort to develop online
books. Creating new technology, such as gentler monitors and e-book
readers, will address part of the problem, while cultural expectations will
likely change as technology advances. Siemens believes that electronic
reading will make reading a far more active experience. "What the future
looks like is not a single book in isolation, but a book integrated with
everything else on the Internet," he says. "The key is figuring out how to
present it in a form we're all comfortable with."
A New Approach to Search
Business Communications Review (10/07) Vol. 37, No. 10, P. 19; Weinman,
Joe
A new Web search architecture that exploits aggregate network data,
metadata, and statistics to better weight search results according to
relevancy, augment the scope of search to include the "Deep Web," and lower
the incidence of click fraud is presented by strategy and emerging
technologies speaker Joe Weinman. Classic search strategies generally
leverage link analysis and devalue real-world traffic as a ranking
criterion, while Weinman's method emphasizes such traffic. "Network
traffic statistics such as unique visitors, interval between visitor
arrival at a page or site and departure from a page or site, packets
transferred, subsequent clicks from a page versus reloads of prior pages,
clicks leading to other pages within a site, and similar types of measures
could be an excellent indicator of average user interest in a page or site,
which in turn is a proxy for relevance," Weinman writes. "Moreover, if the
data were collected by a network service provider on a statistically
sampled basis and/or with outliers discarded, it would be very hard to
spoof." Additionally, a network-based search approach could enhance Web
crawling because a network service provider could automatically glean index
data and acquire that data faster, while traditional and network-based
crawling strategies could be complementary. Click fraud could be better
detected and ameliorated by Weinman's approach. Client browsers, search
portals, and network service providers are the chief candidates for
delivering optimal search results rankings, and Weinman speculates that the
third candidate could be best suited for the job.
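Weinman's core idea, folding aggregate traffic statistics into a page's rank alongside a classic link-analysis score, can be sketched as a simple weighted blend. The particular signals, caps, and weights below are illustrative assumptions, not the article's formula.

```python
def traffic_score(stats):
    """Combine per-page network statistics into one engagement signal in [0, 1].
    Each raw signal is normalized against an assumed cap."""
    dwell = min(stats["avg_dwell_seconds"] / 120.0, 1.0)      # cap at 2 minutes
    depth = min(stats["onward_clicks_per_visit"] / 5.0, 1.0)  # cap at 5 clicks
    reach = min(stats["unique_visitors"] / 1e6, 1.0)          # cap at 1M visitors
    return (dwell + depth + reach) / 3.0

def rank(link_score, stats, traffic_weight=0.4):
    """Blend link analysis with observed traffic; traffic_weight=0 reduces
    to classic link-based ranking."""
    return (1 - traffic_weight) * link_score + traffic_weight * traffic_score(stats)

score = rank(0.7, {"avg_dwell_seconds": 60,
                   "onward_clicks_per_visit": 2,
                   "unique_visitors": 250_000})
```

Because the traffic inputs would come from statistically sampled, provider-side measurements rather than from the page itself, they are, as Weinman argues, much harder to spoof than links alone.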