FCC and Vendors are Developing Wireless 'White Space'
Devices
InformationWeek (08/10/07) Gardner, W. David
The FCC and a coalition of high-tech companies led by Google and Microsoft
are hoping that "white space devices" in smart phones and laptops will be
able to take advantage of the TV spectrum switchover from analog to digital
in February 2009. The white space devices (WSD) are designed to use
unregulated TV frequencies, primarily to deliver inexpensive wireless
broadband connections to rural areas. The two organizations vowed to find
solutions to interference and other problems that prevented the devices
from obtaining approval, and the industry coalition indicated that it would
work to develop new WSDs that can pass the FCC's tests. The FCC suggested
that the coalition's first attempt at obtaining approval might have been
premature. "The devices we have tested represent an initial effort, and do
not necessarily represent the full capabilities that might be developed
with sufficient time and resources," the FCC says in a report on the WSDs.
"Accordingly, we are open to the possibility that future prototype devices
may exhibit improved performance." Additional members of the coalition
include Dell, EarthLink, Hewlett-Packard, Intel, and Philips.
Yahoo Exec Gets ACM SIGKDD Honor
Silicon Valley/San Jose Business Journal (08/10/07)
ACM's Special Interest Group on Knowledge Discovery and Data Mining
(SIGKDD) presented Yahoo's chief data officer and executive vice president
Usama Fayyad with its 2007 Innovation Award. According to the SIGKDD
citation, Fayyad "made major contributions to the advancement of the data
mining and knowledge discovery field, including machine learning and data
mining algorithms that scale to large commercial database systems and the
development of fundamental applications in mining massive science data sets
that have led to significant new scientific discoveries." Yahoo lauded the
ACM award as being the highest honor for individuals involved in data
mining and knowledge discovery. To read the citation that accompanies this
award, see
http://www.acm.org/sigs/sigkdd/awards_innovation.php#2007i.
'Virtual Sandboxing' Provides Safe Security
Testing
Computerworld (08/09/07) Hines, Matt
The number of threats Internet users have to face continues to grow, but
security researchers at the Usenix Security Symposium presented a new
process for protecting users with execution-based malware detection.
University of Washington graduate student Alexander Moshchuk demonstrated a
tool that uses a "virtual sandbox" to test Web applications for suspicious
behavior before allowing the application to reach the end-user browser.
Several other techniques have been developed that can protect end-users
from vulnerabilities that have not been identified or patched.
Virtualization is being adopted by many researchers to identify unknown
vulnerabilities, and Moshchuk pointed out a tool created at the University
of Washington called SpyProxy. SpyProxy runs as a virtual machine that
sits between an end user's browser and a Web site, downloading and testing
any application the browser tries to access so that potential attacks are
caught before they reach the browser. SpyProxy's virtual machine mirrors
the browser of the person running the tool and renders any page or
application that is accessed to see whether it contains an attack.
Moshchuk says SpyProxy can effectively run and analyze any type of Web page
or application in a few seconds to determine if it contains any patterns
common in many threats. SpyProxy does have some limitations, as it works
more effectively on sites that contain larger volumes of static content,
such as text, and sometimes it has difficulty determining when a page has
finished loading, which can add to the delay. The University of Washington
team says SpyProxy is capable of monitoring multiple users on clusters of
workstations, and a single-CPU device can process about 82,000 page
requests in one day, which should cover about 800 users per machine. The
researchers plan to distribute the SpyProxy program free of charge.
Wi-Fi in the Sky Coming Next Year
NewsFactor Network (08/09/07) Long, Mark
American Airlines has partnered with communications provider AirCell in an
effort to provide an onboard Wi-Fi system on transcontinental flights
starting in early 2008. The broadband service is expected to let travelers
check email, search the Web, access corporate intranets, and monitor news
through any Wi-Fi-capable laptop, PDA, or smartphone for about $10 per
flight, though no official price has been set.
Promises for Internet-connected flights have fallen through in the past,
but IDC research director Rena Bhattacharyya says she believes Wi-Fi will
succeed because airlines are feeling more pressure from passengers who are
used to almost universal Internet availability. Wi-Fi has also become a
more affordable option for airlines, according to IDC research manager
Godfrey Chua. "With respect to the wireless infrastructure, the pace of
technological change has increased, and the cost per bit has gone down
substantially from what it was just a few years back," says Chua. "If we
take a look at the traditional cellular system, in the last five years the
cost of a GSM base station has come down by 50 percent." Wireless
frequencies are also able to carry a greater amount of information than
ever before. AirCell's air-to-ground network will use a series of cellular
towers to communicate with planes, which will have three different
antennas.
National Science Board Approves National Action Plan for
21st Century STEM Education
EurekAlert (08/10/07)
The National Science Board is seeking public comment on its draft of a
national action plan for improving the STEM (science, technology,
engineering, and mathematics) education system. The board will take
feedback from the public into consideration as it settles on a final
version of the national action plan, which largely focuses on fleshing out
coordination across states as well as grade levels, and building up the
ranks of qualified K-12 STEM teachers. The proposal offers strategies for
fostering collaboration between local, state, and federal governments and
nongovernmental STEM education groups, with hopes of increasing the number
of STEM education workers. One recommendation is to use the new
congressionally chartered non-federal National Council for STEM Education
to coordinate the activities of federal STEM education programs. The board
also sees improving STEM education programs at the National Science
Foundation as a way of enhancing the competitiveness of the U.S. workforce.
The final version of the national action plan is scheduled to be released
on Oct. 3, 2007.
Render Smoke and Fog Without Being a Computation
Hog
Jacobs School of Engineering (UCSD) (08/09/07) Kane, Daniel B.
University of California San Diego computer scientists have developed a
way to generate smoky, foggy, and smoggy scenes without monopolizing and
slowing down processing power, as previous computer graphics models do
when trying to create the same scenes. UCSD computer science Ph.D.
candidate Wojciech Jarosz, who led the study, says the new graphics
generator creates a huge computational savings and allows users to render
explosions, smoke, and architectural lighting in hazy conditions much
faster. Currently, rendering computer graphics with participating media,
or elements that absorb or reflect part of the light such as smoke, fog, or
clouds, generally requires significant computational power and time. To
reduce computing time for hazy effects, Jarosz developed a method called
"radiance caching." The program divides the hazy effect into numerous
small, overlapping, circular sections. As the light travels from the
object being viewed to the viewer, the program calculates the effect each
of the small sections has on the light. The information created for those
sections is saved and used for light traveling from nearby points that
intersect the same sections. Radiance caching can also identify and use
previous computer lighting values, saving even more time and processing
power. "Our approach handles both heterogeneous media and anisotropic
phase functions, and it is several orders of magnitude faster than path
tracing. Furthermore, it is view-driven and well suited for large scenes where
methods such as photon mapping become costly," the researchers wrote in a
paper presented at SIGGRAPH.
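The paper's full method involves volumetric cache records with error control,
but the core amortization idea can be sketched in a few lines of Python;
everything below (the class, the reuse radius, the stand-in compute function)
is an illustrative toy, not the authors' implementation.

```python
# Toy illustration of the radiance-caching idea, not the authors'
# algorithm: cache an expensive lighting computation at sample points and
# reuse a cached value whenever a new query falls close enough to an
# existing sample, instead of recomputing from scratch each time.

import math

class RadianceCache:
    def __init__(self, reuse_radius):
        self.reuse_radius = reuse_radius
        self.samples = []  # list of (position, radiance) pairs

    def query(self, point, compute_radiance):
        """Reuse a nearby cached value, or compute and cache a new one."""
        for position, radiance in self.samples:
            if math.dist(position, point) <= self.reuse_radius:
                return radiance  # cache hit: skip the expensive step
        value = compute_radiance(point)  # the cost being amortized
        self.samples.append((point, value))
        return value

# Example: the fake "expensive" function runs only once for the two
# nearby query points below; the second query is answered from the cache.
cache = RadianceCache(reuse_radius=0.5)
expensive = lambda p: sum(p) * 0.1  # stand-in for a scattering integral
print(cache.query((1.0, 2.0, 3.0), expensive))
print(cache.query((1.1, 2.0, 3.0), expensive))  # within 0.5: cache hit
```

Here compute_radiance stands in for the expensive lighting integration; the
savings come from answering most queries out of the cache of previously
computed nearby samples.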
Location, Semifinalists Set for Urban Robot Race
CNet (08/09/07) Olsen, Stefanie
The Defense Advanced Research Projects Agency has selected a military
training facility in Victorville, Calif., as the site for the qualifying
and final rounds of the 2007 Urban Grand Challenge robot race. The
qualifying round is scheduled for Oct. 24 and the finals for Nov. 3 at the
former George Air Force Base. In the past, DARPA did not announce the location
months in advance so as not to give any participants a competitive
advantage. The agency announced 36 semifinalist teams that will program
cars to race through mock city streets while adhering to California's
traffic laws; the semifinalists include the Stanford Racing Team, which
won the 2005 Grand Challenge with a
modified Volkswagen SUV. Stanford will be challenged by Team Oshkosh
Truck, which has a 16-ton Oshkosh Truck; Team Gray, a Grand Challenge
finalist; and Carnegie Mellon's Tartan Racing, which will have a modified
Chevy Tahoe. The winner will receive $2 million, second place will take
home $1 million, and third place will receive $500,000. "The vehicles must
perform as well as someone with a California driver's license," says DARPA
director Tony Tether. "The depth and quality of this year's field of
competitors is a testimony to how far the technology has advanced."
Flash Memory Replacement Coming This Year?
CNet (08/09/07) Kanellos, Michael
The general consensus among attendees at the Flash Memory Summit is that
Intel and STMicroelectronics, which formed a joint venture to make memory,
may soon start producing phase-change memory, a potential replacement for
flash memory. Phase-change memory, also known as ovonics, is made from
material similar to that of CDs and is said to be denser than flash
memory. The material is made into chips; microscopic bits on the chip are
then rapidly heated to about 600 degrees Celsius, which changes the bits'
crystalline structure into an amorphous structure. Samsung, Philips, and
others have worked on developing phase-change memory, though Intel and
STMicroelectronics, under the joint-venture name Numonyx, may be the first
to release a product, as industry observers believe that the two may soon
outline plans to commercially release phase-change memory chips later this
year. Intel has made no official comment on when it expects to release
phase-change memory, but Greg Komoto, manager of strategic planning for
Intel's flash memory group, says Intel has created samples of 90-nanometer
phase-change chips, which Intel believes could replace NAND flash. Eli Harari,
CEO of SanDisk, one of the primary manufacturers of NAND flash, says NAND
flash will start to hit a wall in about seven years.
New Search Engine Ranks Tables by Title, Document
Content, Text Reference
Penn State Live (08/09/07)
Penn State researchers have developed a search engine capable of indexing
and ranking tables based on their title, text references to the table, and
date of publication. The search engine, known as TableSeer, is also able
to identify and extract tables from PDF documents, and is able to identify
and consider how frequently a document is cited when ranking search
results. "TableSeer makes it easier for scientists and scholars to find
and access the important information presented in tables, and as far as we
know, is the first search engine for tables," says Prasenjit Mitra, an
assistant professor in the Penn State College of Information Sciences and
Technology and one of the lead developers of TableSeer. Being able to
quickly identify and rank tables could be an extremely useful tool to
scientists and researchers. A search of 10,000 documents from journals and
conferences showed that more than 70 percent of papers in chemistry,
biology, and computer science included tables, and most documents had
multiple tables. Some software can identify and extract tables from text,
but no other search engine can scan for tables across documents, meaning
scientists and scholars need to manually look through documents to find
tables. TableSeer can automatically find tables, capture each table's
data, footnotes, and titles, and allow users to search for a particular column
in a table. Mitra says TableSeer correctly identified and retrieved 93.5
percent of tables created in text-based formats during tests with documents
from the Royal Society of Chemistry. Continuing research on TableSeer will
strive to improve the ranking algorithm and add new features. The
researchers are also working on a search engine that can identify, extract,
and rank figures found in documents. The researchers plan to make
TableSeer's source code available as the project nears completion.
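TableSeer's actual ranking algorithm isn't spelled out in the article, but a
table-specific score of the kind described (title match, text-reference
match, citation boost) might look roughly like the hypothetical sketch below;
the features and weights are guesses for illustration only.

```python
# Hypothetical sketch of the kind of scoring the article describes, not
# TableSeer's published algorithm: rank each extracted table by how well
# the query matches its title and the text that references it, with a
# boost for heavily cited source documents. Weights are illustrative.

from dataclasses import dataclass

@dataclass
class TableRecord:
    title: str
    referring_text: str  # sentences in the document that cite the table
    citation_count: int  # how often the source document is cited

def term_overlap(query: str, text: str) -> float:
    """Fraction of the query's terms that appear in the text."""
    query_terms = set(query.lower().split())
    if not query_terms:
        return 0.0
    return len(query_terms & set(text.lower().split())) / len(query_terms)

def score(table: TableRecord, query: str) -> float:
    return (3.0 * term_overlap(query, table.title)       # title matters most
            + 1.0 * term_overlap(query, table.referring_text)
            + 0.1 * table.citation_count)                # mild citation boost

def rank(tables, query):
    """Return tables sorted from best to worst match for the query."""
    return sorted(tables, key=lambda t: score(t, query), reverse=True)
```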
Tiny 'GlowBots' Hint at Future of Robotics
Discovery News (08/02/07) Staedter, Tracy
European ECAgents project researchers are examining how robots interact
with each other and with their owner. The robots, called GlowBots, are
small, round robots about the size of a coffee mug. Each one has eight
infrared sensors, 148 light-emitting diodes, a camera, microphones, a
computer, and a Bluetooth connection. The GlowBots "express" themselves by
displaying intricate patterns of flashing lights. Viktoria Institute
Future Applications Lab research assistant Mattias Jacobsson says
interacting with a GlowBot would be less like the interaction between a
person and a dog or a cat and more like interacting with a pet spider or
lizard. The purpose of the project is to see if the interactions the
robots have with humans, and each other, could lead to unconventional roles
for future devices, like machines that guide a person through an airport or
heighten the experience on an amusement park ride.
The Bytes and the Bees
PC Magazine (08/07/07) Vol. 26, No. 15, P. 18; Groc, Isabelle
Technology is enabling researchers to learn more about nature, with hopes
of applying such insights to human problems. "The advent of computer
modeling capabilities and the arrival of nanotechnology allow us to
interrogate the wisdom that nature displays," says John Pietrzyk, founder
of Biomimetic Connections, a provider of information on bio-inspired
intellectual properties. With biomimetics, researchers hope to develop
computer technology that will be able to learn, adapt to change, and
protect and repair itself. Biomimetics' influence already can be seen in
developments such as IBM's Airgap Microprocessor, which was inspired by the
self-assembly methods of snowflakes. An algorithm that optimizes Internet
servers is based on honeybee colonies, and Melanie Mitchell, a computer
science professor at Portland State University, is using the theory of
natural selection to determine the best search parameters for multimedia
searches. However, biomimetics is more than just transferring nature's
code to an engineering environment, says Julian Vincent, director of the
Center for Biomimetic and Natural Technologies at the University of Bath in
England. "We should understand the way biology does its engineering and
then replace current engineering with the biological version," says
Vincent.
How Green is IT's Future?
eWeek (08/08/07) Preimesberger, Chris
Although industry experts are predicting the rise of self-contained data
centers that no longer require cooling, the EPA's Energy Star program
recently outlined energy challenges the IT industry will have to face.
The EPA estimates that the IT industry consumed about 61 billion
kilowatt-hours in 2006, which is about 1.5 percent of the United States'
total electricity consumption and cost about $4.5 billion. An EPA report
submitted to Congress predicts that the IT industry's power consumption
could nearly double by 2011. Federal servers and data centers accounted
for about 10 percent of that total, roughly 6 billion kWh, costing some
$450 million. The EPA also provided Congress with some suggestions on what
the IT industry could do to reduce its power consumption, including
establishing standardized metrics for data centers, creating an Energy Star
performance rating system, and offering financial incentives such as tax
credits and utility rebates. The EPA also commended technology companies
for their efforts to make data centers more efficient, which include
increasing processor performance without increasing power consumption and
applying combinations of hardware and software to make servers and
computing devices more efficient.
Robofin Could Help Naval Ships Find Mines
InformationWeek (08/06/07) Jones, K.C.
Researchers at the Massachusetts Institute of Technology (MIT) have cited
the common bluegill sunfish as the inspiration for a mechanized fin that
could be used to propel robotic submarines and other types of autonomous
underwater vehicles (AUVs). AUVs equipped with a mechanical fin would not
need propellers, and the fin would enhance AUVs' ability to maneuver,
making it easier for the craft to perform tasks like sweeping for
landmines. The researchers explain that the mechanized fin, which is
constructed with a new electricity-conducting polymer, would not create a
backward drag. "If we could produce AUVs that can hover and turn and store
energy and do all the things a fish does, they'll be much better than the
remotely operated vehicles we have now," said MIT researcher James
Tangorra. The researchers are constructing multiple prototypes of the fin.
They explained that bluegill sunfish move forward constantly, and their
fins are capable of changing shape.
Encrypting the Future
Government Computer News (08/06/07) Hickey, Kathleen
Although the cryptographic security standards used in public-key
infrastructures, RSA and Diffie-Hellman, have not been cracked, they were
introduced in the 1970s and there is growing concern that the standards may
soon be outdated. Consequently, the National Security Agency wants to
switch cybersecurity to elliptic-curve cryptography (ECC) by 2010, the same
year the National Institute of Standards and Technology plans to recommend
all government agencies switch to ECC, according to Dickie George,
technology director of the NSA's information assurance directorate. Using
current standards requires continually extending the key lengths, which
increases processing time and makes it difficult to secure small devices.
ECC is a mathematical algorithm that is used to secure data in transit, and
because it provides greater security using a smaller key size, it takes
less computational time and can be used on smaller devices, like cell
phones, wireless devices, and smart cards. Stephen Kent, chief scientist
at BBN Technologies, says that to keep RSA and Diffie-Hellman keys, which
currently extend up to 1,024 bits, secure for the next 10 to 20 years, the
keys would have to at least double in length and eventually expand up
to 4,096 bits. Switching to ECC, however, will require a massive
replacement of hardware and software, and with more than a million
different pieces of equipment that need to be changed to ECC, it could take
the NSA more than 10 years to complete the process. George says the move
to ECC is more than just replacing an encryption system, and is actually
upgrading the entire communications structure, which the NSA will use to
work more closely with other governments, U.S. agencies and departments,
first responders, and the private sector. Interoperability is key to the
new communication program and the reason behind the Cryptographic
Modernization initiative, which was started in 2001 and promotes ECC.
Experts agree that there is no new technology comparable to ECC. "ECC is
the only impressive thing out there," Kent said. "People don't get excited
every time a new thing comes along. We wait several years and let people
try to crack it first. ECC definitely passed the test in this regard."
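As a concrete illustration of the size advantage described above: a 256-bit
ECC key is commonly rated comparable in strength to an RSA key of roughly
3,072 bits. Below is a minimal Python sketch of an elliptic-curve
Diffie-Hellman exchange, assuming the third-party pyca/cryptography package
is installed; it is an illustration of ECC in general, not of the NSA's
systems.

```python
# Minimal sketch of an elliptic-curve Diffie-Hellman key agreement over
# the NIST P-256 curve, assuming the third-party pyca/cryptography
# package (pip install cryptography). A 256-bit ECC key like this one is
# commonly rated comparable in strength to an RSA key of roughly 3,072
# bits, which is the size advantage the article describes.

from cryptography.hazmat.primitives.asymmetric import ec

# Each party generates a 256-bit private key on the P-256 curve.
alice_private = ec.generate_private_key(ec.SECP256R1())
bob_private = ec.generate_private_key(ec.SECP256R1())

# Each side combines its private key with the other's public key and
# arrives at the same shared secret, which never crosses the wire.
alice_shared = alice_private.exchange(ec.ECDH(), bob_private.public_key())
bob_shared = bob_private.exchange(ec.ECDH(), alice_private.public_key())

assert alice_shared == bob_shared
print("shared secret:", len(alice_shared) * 8, "bits")
```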
Artificial Intelligence Is Lost in the Woods
Technology Review (08/01/07) Vol. 110, No. 4, P. 62; Gelernter, David
Yale University computer science professor David Gelernter thinks the idea
of constructing a conscious mind out of software running on a digital
computer is an impossibility, but believes an unconscious mind can be
simulated. He says simulated emotions rather than genuine emotions will be
sufficient for this purpose, although simulating emotions is no easy task,
and the representation of memories will also be a formidable challenge.
"Consciousness is necessarily subjective: you alone are aware of the
sights, sounds, feels, smells, and tastes that flash past 'inside your
head,'" writes Gelernter. "This subjectivity of mind has an important
consequence: there is no objective way to tell whether some entity is
conscious. We can only guess, not test." The author maintains that a
conscious mind derived from software would be of little use; it could pass
the Turing test for machine intelligence, but its emotional experience
would be limited by the lack of a physical body, and thus its communication
with human beings would be restricted to a highly superficial level.
Gelernter suggests a "cognitive continuum" of mental states between highest
and lowest focus levels exists, which plays a key role in one's mode of
consciousness at any given moment. Acceptance of the cognitive continuum's
existence could facilitate the modeling of thought dynamics in software,
and Gelernter reasons that analogy discovery--the mechanism of
creativity--could also be explained and modeled in this way.
Computer Graphics Spills from Milk to Medicine
PhysOrg.com (08/07/07)
Researchers at the University of California San Diego computer graphics
department have developed a program that can determine the type of milk -
skim, 2 percent, or whole - by examining how light interacts with its
ratio of fat to protein. The program, which is described in a paper
presented at SIGGRAPH, can also be used to generate images of milk, and
other liquids, that display the exact properties of the desired liquid, be
it whole milk, skim milk, or water mixed with a vitamin or protein. The
program eliminates the restrictions of the Lorenz-Mie theory, which has
existed for more than a century and was introduced to graphics in 1995.
The Lorenz-Mie theory is a complete solution to Maxwell's equations for
the scattering of electromagnetic waves by a homogeneous, spherical
particle set in a non-absorbing medium. "We have the first complete, bottom-up
theoretical model that addresses the shortcoming of the Lorenz-Mie theory
for participating media. It allows us to render computer graphics for
absorbing materials and with non-spherical particles based on the contents
of the material," says Henrik Wann Jensen, a professor of computer science
at UCSD and an Academy Award winning computer graphics researcher.
"Computer graphics is no longer just about pretty pictures and realism for
the sake of aesthetics. We have harnessed the math and physics necessary
to generate realistic images of a wide range of natural materials based on
what they are made of. With our approach, computer graphics can contribute
to a handful of pressing problems including food safety and climate
change," says Jensen. "Putting the model in reverse, grocery stores could
identify spoiled meats, contaminants, or other food safety issues - if a
particular food problem consistently and detectably changed the light
scattering properties of the food."
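For context, the bulk quantity such a model must predict is the medium's
scattering coefficient. In the standard radiative-transfer formulation (a
textbook relation, not a detail taken from the UCSD paper), a dilute
suspension of spherical particles scatters light of wavelength $\lambda$
according to

\[
\sigma_s(\lambda) = \int_0^\infty N(r)\, C_{\mathrm{sca}}(r, \lambda)\, dr,
\]

where $N(r)\,dr$ is the number of particles per unit volume with radius
between $r$ and $r + dr$, and $C_{\mathrm{sca}}$ is the per-particle
scattering cross section supplied by Lorenz-Mie theory. Changing the
fat-to-protein ratio in milk changes $N(r)$, which is why the scattered
light can identify the milk type, and why running the relation in reverse
can recover the contents of a material from its appearance.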
The Scientific Research Potential of Virtual
Worlds
Science (07/27/07) Vol. 317, No. 5837, P. 472; Bainbridge, William Sims
Online virtual worlds can be useful research tools for behavioral, social,
and economic science, along with human-oriented computer science, writes
William Sims Bainbridge of the National Science Foundation's Division of
Information and Intelligent Systems. Popular worlds such as Second Life
(SL) and World of Warcraft (WoW) are accessed through personal computers
running special software that links to one or more servers that pass data
back and forth between users over the Net, and these simulations involve
three-dimensional spaces inhabited by manipulable objects, currency, and
sometimes interactive artificial intelligence characters. SL is
particularly amenable to formal experiments in social psychology or
cognitive science because it can support a virtual facility and enlist
research subjects like an actual laboratory, while WoW may be more suitable
to nonintrusive statistical research into social networks and economic
systems by virtue of its ability to produce a huge volume of information on
social and economic transactions. Virtual worlds are a prime environment
for creating online laboratories that can automatically recruit vast
numbers of research subjects inexpensively, an important factor in
experiments designed to explore the dynamics of complex causal systems.
Online game makers might welcome such experimentation as an opportunity to
make game play more interesting for subscribers. There is an ethical angle
to consider in such research, given that it involves human subjects. "It
is especially important to study virtual worlds now, because the current
period of transformation may not last much longer, and because it may be
impossible to reconstruct its key processes and phenomena entirely from
historical records that are naturally preserved," says Bainbridge.