Social Network Sites Seek Common Software Standard
New York Times (11/02/07) P. C7; Helft, Miguel; Stone, Brad
MySpace and Bebo have joined a Google-led alliance promoting OpenSocial, a
common set of standards for software developers to write programs for
social networks. The new alliance puts pressure on Facebook to join
OpenSocial. Facebook opened its site to developers last spring; as the
first social networking site to let independent developers create
programs for its users, Facebook drew a significant crowd away from
MySpace, the world's largest social network with 110 million active
members. Bebo is the largest social network in Britain with 39 million
active users. MySpace CEO Chris DeWolfe says OpenSocial will be bigger
than any other platform with about 200 million active users. The open
standard could lead to a wave of innovation on social networks as
applications can easily reach more users than ever before, encouraging
developers to create new Internet tools. Google and other alliance members
say they have invited other social networks, including Facebook, to join
the alliance, saying the most important aspect of open projects is that
everyone participates. Facebook says it has not been fully briefed on the
initiative, but that it would evaluate OpenSocial once it has a chance to
study it. The standard does not mean that every program will successfully
work on every social site, as different sites may not incorporate the
standard as deeply or as effectively as other sites.
11 Finalists to Hit the Streets in DARPA's $2M Urban
Challenge
Computerworld (10/01/07) Gaudin, Sharon
The field of 35 semifinalists in DARPA's Urban Challenge has been narrowed
to 11 finalists that will compete on Saturday, Nov. 3, at George Air Force
Base in Victorville, Calif. The driverless cars will have to navigate 60
miles of urban streets, multiple lanes, traffic circles, and four-way
stops. The finalists include entries from Virginia Tech, Cornell, Carnegie
Mellon University, and Stanford. DARPA director Tony Tether says the
National Qualification Event tested the robotic cars' ability to merge into
traffic, navigate four-way intersections, respond to blocked roads, pass
oncoming cars on narrow roads, and keep up with traffic on two-lane and
four-lane roads. Tether says the only real difference between the
qualifying event and the final test is that there will be multiple robotic
vehicles on the course at the same time. "Vehicles competing in the Urban
Challenge will have to think like human drivers and continually make
split-second decisions to avoid moving vehicles, including robotic vehicles
without drivers, and operate safely on the course," says Urban Challenge
program manager Norman Whitaker. "The urban setting adds considerable
complexity to the challenge faced by the robotic vehicles, and replicates
the environments where many of today's military missions are conducted."
The winning team will receive a $2 million prize, second place will receive
$500,000, and third will receive $250,000.
Minority Professors Are Underrepresented in Top Science
Programs, Report Says
Chronicle of Higher Education (11/01/07) Wilson, Robin
Students in underrepresented minority groups earn undergraduate and
doctoral degrees in science and engineering at much greater rates than
minority professors hold faculty jobs in those same disciplines at the
nation's top research universities, concludes a new report funded by the
National Science Foundation and the Alfred P. Sloan Foundation. The
report, "A National Analysis of Minorities in Science and Engineering
Faculties at Research Universities," says that the gap between the number
of underrepresented minority students earning science and engineering
degrees and the number of minority faculty members teaching in those
disciplines means minority students have very few same-race role models,
which could lead to fewer minority students who are interested in pursuing
those disciplines. The report found that in computer science,
underrepresented minority students earned 20.6 percent of bachelor's
degrees in 2005, and 6.6 percent of Ph.D.'s from 1996 to 2005, while
minority professors represented only 2.8 percent of faculty members at the
nation's top 100 universities. Women, however, now represent a larger
portion of the assistant professors in science and engineering than they
did in 2002. The proportion of female assistant professors rose by 10
percent in both computer science and economics. Nevertheless, large
disparities still exist between the portion of doctorates awarded to women
and the percentage of faculty positions in those disciplines that are held
by women.
Computer Science Students Present at Prestigious
OOPSLA
University of Minnesota Morris (10/30/2007) Riley, Judy
The poster session at ACM SIGPLAN's Object-Oriented Programming, Systems,
Languages, and Applications (OOPSLA) conference in Montreal gave computer
science students at the University of Minnesota, Morris an opportunity to
show off their research on Java generics. Eli Mayfield and Kyle Roth
teamed up with Daniel Selifonov and Nathan Dahlberg to create the poster,
which was titled "Optimizing Java Programs Using Generic Types." The UMM
poster was the only submission by undergraduate students. "OOPSLA is a
very prestigious conference and this is a very impressive achievement,"
says Elena Machkasova, adviser and UMM assistant professor of computer
science. In addition to the presentation on Oct. 22, Mayfield and Roth
also met with Guy Steele and other prominent researchers, and attended
talks at the conference.
Microsoft Hopes ScRGB Will Improve Photo Colors
CNet (11/01/07) Shankland, Stephen
The depth and richness of photos captured by digital cameras and viewed on
a computer or television screen could be enhanced with the adoption of
scRGB, a color space developed by Microsoft that can encode colors as
numbers a computer can process. The color space utilized by today's
computers and cameras, known as sRGB, describes colors as a specific combination of
red, green, and blue, which limits the breadth of displayed colors as well
as the nuances of the tonal shades that separate bright from dark. "ScRGB
would allow a richer saturated red value ... than the sRGB limit for red,"
says Bill Crow, leader of Microsoft's HD Photo initiative. Kevin Connor
with Adobe's digital imaging group notes that the color management problem
is amplified by the sRGB color space's small range in comparison to scRGB.
The drawbacks of sRGB are especially apparent to enthusiasts of
high-dynamic range photography, in which multiple photos taken at different
exposures are integrated into a single image. ScRGB is capable of further
extension and finer subdivision than sRGB, and the new color space can use
far higher numbers of bits to describe each pixel, as well as employ
integers and floating-point numbers to define gradations from light to
dark. Crow says scRGB has been a royalty-free standard for four years and
is thus "free for anyone to use," but adoption is more likely to be
stimulated by the fact that support for the color space is embedded within
Windows Vista and the HD Photo file format Microsoft is trying to
standardize.
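The difference in representable range can be illustrated numerically. In the sketch below, 8-bit sRGB clamps linear values to the 0-1 range, while a 16-bit scRGB-style encoding stores a linear value v as 8192*v + 4096 and so covers roughly -0.5 to 7.5. The constants follow the published scRGB encoding, but gamma handling is omitted for simplicity, so treat this as an illustration rather than a faithful converter:

```python
def srgb8_encode(v):
    # Classic 8-bit sRGB: values outside [0, 1] are simply clipped,
    # so highlight and shadow detail beyond the gamut is lost.
    return round(max(0.0, min(1.0, v)) * 255)

def scrgb16_encode(v):
    # 16-bit scRGB-style encoding: a linear value v is stored as
    # 8192*v + 4096, covering roughly -0.5 .. 7.5 instead of 0 .. 1.
    return round(max(0, min(65535, 8192 * v + 4096)))

def scrgb16_decode(e):
    return (e - 4096) / 8192.0

# A highlight three times brighter than sRGB's maximum:
bright = 3.0
print(srgb8_encode(bright))                  # clipped to 255 -- detail lost
print(scrgb16_decode(scrgb16_encode(bright)))  # round-trips to 3.0
```

The extra headroom is exactly what high-dynamic-range workflows need: a value that sRGB would clip survives the round trip intact.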
Machine Learning Fuels Sun Music Recommendation
Technology
Network World (10/31/07) Brodkin, Jon
A Sun open source project dedicated to building a music recommendation
system is using software that listens to and analyzes music. Automated
recommendation systems are generally based on who is listening to the
music, not on an analysis of the music. Consumer preferences can be
drastically different, so these systems sometimes produce odd and
somewhat erroneous recommendations. To create a better recommendation
system, Sun has developed music similarity algorithms that examine the
actual sound of the music using machine learning that analyzes features
such as frequency and beats per minute to determine the genre and what
instruments are being played. The system could make it easier for new
artists to be found online, an increasingly difficult task as more and more
of the world's music is stored online, says Paul Lamere, principal
investigator of Sun's Search Inside the Music. "Recommendation technology
is key," Lamere says. "The Web is going to be filled with billions of
tracks and there's going to be millions of tracks arriving every week. The
question is, when you have a million songs in your in-box, how are you
going to find something you really like?" Beyond sound recognition, Sun
has also included a tagging system that categorizes music based on its
attributes, with descriptions such as quirky, indie, rock, fast, cute, or
fun. The tags are generated by searching through reviews, lyrics, music
blogs, social tagging sites, and artist biographies. Using a comprehensive
search of the Web to create tags prevents people from manipulating the
system by adding their own tags to make a track more popular. The search
engine also provides links to videos, pictures, and information on upcoming
concerts.
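The acoustic approach can be sketched as a nearest-neighbor search over per-track feature vectors. The features and values below are invented for illustration; Sun's actual system extracts far richer audio features and applies more sophisticated machine learning:

```python
import math

# Hypothetical feature vectors: (beats per minute, spectral brightness 0-1,
# percussive energy 0-1). Values are made up for illustration.
tracks = {
    "punk_song":   (180, 0.80, 0.90),
    "folk_ballad": ( 70, 0.30, 0.20),
    "dance_track": (128, 0.70, 0.85),
}

def distance(a, b, weights=(1 / 200, 1.0, 1.0)):
    # Scale BPM down so it doesn't dominate the unit-range features.
    return math.sqrt(sum((w * (x - y)) ** 2
                         for w, x, y in zip(weights, a, b)))

def recommend(seed, library, k=1):
    # Rank every other track by acoustic distance to the seed track.
    ranked = sorted((t for t in library if t != seed),
                    key=lambda t: distance(library[seed], library[t]))
    return ranked[:k]

print(recommend("punk_song", tracks))  # ['dance_track']
```

Because the ranking depends only on the audio itself, an unknown artist with a similar sound scores just as well as a popular one, which is the property the article highlights.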
How to Organize the Web
Technology Review (11/02/07) Naone, Erica
Microsoft's Live Labs is developing Listas, a new Internet tool designed
to offer a way of organizing online content. Listas is based on creating
lists, either by typing in original content, taking clippings from Web
pages, or reading and editing public lists. The lists can include almost
any type of content, including images and videos, can be public or private,
and can be tagged to make searching easier. Listas also allows users to
acknowledge each other as "friends." Lists made by a user or by that user's
friends, along with public lists the user links to, are all collected on a single
page. Downloading and installing the optional Listas toolbar, which works
with Internet Explorer, makes it easier to select items from Web pages such
as text, URLs, blog posts, or product listings and add them to a list.
"Lists are a fundamental data type across the Web," says Live Labs product
manager Alex Daley. "A great deal of the information we produce and
consume across the Web is in this structure." Daley says Listas' greatest
virtue is its generality, which allows users to organize data however they
see fit. Live Labs director Gary Flake says Listas was created because he
had a feeling that his online information was spread out everywhere and no
longer under his control, noting that the more involved a person is in
online communities, the more severe the problem can become.
Researchers Dig for Hidden Links in Spam
IDG News Service (10/31/07) Kirk, Jeremy
The links in spam messages are often used by filtering programs to
determine whether the message should be blocked, but spammers find loopholes by
crafting links that filters cannot identify but that remain valid,
says University of Quebec software engineering professor Christopher
Fuhrman. Spammers change and hide these links by altering the HTML enough
to confuse filters but keep the links readable by browser rendering engines
and email servers. Fuhrman believes that spammers test their altered links
on Microsoft's Outlook program because it uses the same HTML rendering
engine as the Internet Explorer browser. To find spammers' hidden links,
Fuhrman is writing a program that uses Internet Explorer's rendering engine
to parse out the links. Although some services already use algorithms to
parse out the links in spam, the algorithms are hard to write and Fuhrman
is interested in finding a way to parse messages without having to
constantly tweak algorithms to keep up with new tricks used by spammers.
Fuhrman says it is difficult to write a parser that will read links the
same way Internet Explorer's rendering engine does because Microsoft's
source code is secret, so it is better to use the engine as part of the
program. Any links that Internet Explorer's engine finds would be reported
to a blocklist service. "I want to ultimately get it as a Web-based engine
so that users can paste spam, and when it comes out, it will reveal the
links," Fuhrman says.
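Fuhrman's tactic of reusing a rendering engine can be approximated in miniature with any tolerant HTML parser, since such parsers decode character references and sloppy markup much the way browsers do. The sketch below uses Python's standard library and an invented spam snippet; it is not Fuhrman's actual program, which wraps Internet Explorer's engine:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href values the way a rendering engine would: the parser
    decodes character references and tolerates sloppy markup, so links
    that defeat a naive regex are still recovered."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Spam-style obfuscation: entity-encoded characters and an unquoted,
# mixed-case attribute that still renders as a normal link in a browser.
spam = '<A HREF=http&#58;&#47;&#47;example&#46;com/buy>c l i c k</A>'
p = LinkExtractor()
p.feed(spam)
print(p.links)  # ['http://example.com/buy']
```

A filter that only pattern-matches on the raw bytes never sees the string "http://", yet the parsed link is perfectly valid, which is exactly the loophole the article describes.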
Cellphones Team Up to Become Smart CCTV Swarm
New Scientist (10/31/07) Simonite, Tom
Swiss researchers at the Institute for Pervasive Computing in Zurich have
developed Facet, Java-based software that enables camera cell phones to act
as a smart surveillance network capable of spotting intruders or
identifying wildlife. Facet uses Bluetooth wireless technology to
automatically share and analyze information collected by a group of
networked cell phones. To test the software, the researchers attached four
cell phones with Facet to the ceiling of a corridor. The phones were
angled so the camera on each phone covered a different part of the
corridor. Whenever one of the phones detected an object entering or
exiting its field of focus, a message was sent to alert nearby phones,
which relayed the message to the next nearest phones, and so on until the
entire network had been alerted. One of the phones also alerts a computer
using a normal GPRS cell phone connection. The network calculates the
distance between each pair of phones by comparing the time a person takes
to pass from one phone's field of view to the next against the average
walking speed of a human. Knowing the
shape of the network allows the phones to perform complex tasks, such as
reporting when someone walks down a specific corridor or sounding an alarm
if a dangerous animal approaches a camp site. After some improvements to
the software the researchers will release Facet as an open-source project,
allowing anyone to modify the code for their own uses.
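The distance calculation is simple arithmetic: the time a walker spends in the unobserved gap between two cameras, multiplied by an assumed average walking speed. A minimal sketch (the 1.4 m/s figure is a common estimate, not a number from the article):

```python
AVERAGE_WALKING_SPEED = 1.4  # metres per second, a common estimate

def estimate_gap(exit_time_a, enter_time_b, speed=AVERAGE_WALKING_SPEED):
    """Estimate the distance between two cameras' fields of view from the
    time a walker takes to cross the unobserved gap between them."""
    return (enter_time_b - exit_time_a) * speed

# A person leaves phone A's view at t=10.0 s and enters phone B's at t=15.0 s:
print(estimate_gap(10.0, 15.0))  # 7.0 metres apart
```

Repeating this for every pair of phones as people walk past lets the network infer its own layout without any manual calibration.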
Media X Researchers to Explore Fusion Between the Virtual
and Real Worlds
Stanford University (10/31/07)
Stanford University's Media X unites academic researchers and industry
partners to study interactive communications and technology by integrating
communications, engineering, humanities, law, medicine, business, and
design studies. Media X grants have been awarded to seven
multidisciplinary teams of researchers dedicated to exploring how people
use and share information and collaborate in virtual worlds. "The fusion
of virtual and physical worlds for advanced communications represents a new
field of interdisciplinary inquiry," says communication professor Byron
Reeves, co-founder of Media X and the Human Sciences and Technologies
Advanced Research Institute (H-STAR). The seven Media X grants will be
used to study a variety of topics. One study will explore the
virtual-physical-social interplay by examining how social experiences and
interactions in physical places change with the addition of digital
information, and how digital experiences change with physical information.
Another study will develop virtual sensornets to allow scientists to
construct instruments for measuring what is happening in virtual worlds,
and will allow users to control and monitor what is being recorded. Other
studies will examine whether virtual worlds create optimal conditions for
learning and what legal regimes govern virtual communities.
New List Ranks 'Green' Supercomputers
LiveScience (10/31/07) Malik, Tariq
Virginia Tech computer scientist Kirk Cameron and colleague Wu Feng are
creating a list of supercomputers that ranks the most powerful computing
machines in the world on performance and speed, as well as on their energy
efficiency and reliability. Cameron and Feng note that costs are rising
for Japan's Earth Simulator supercomputer because of the sophisticated
cooling systems needed to handle the enormous amount of heat it produces.
They add that Google's new data center at The Dalles, Ore., will use
affordable local power and cooling water from a nearby river. Feng headed
the development of the low-power, high-performance Green Destiny, a machine
that ranked with the Cray T3D MC1024-8 supercomputer at No. 393 on the Top
500 list in 2003. They will introduce the Green500 Supercomputer List in
November. "Over time we anticipate increased participation and
improvements in the ability of the list to reflect high performance and
energy efficiency as technologies improve," Cameron says.
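The Green500 ranks machines by performance per watt (its stated metric is MFLOPS/W). A minimal sketch of the computation, with invented machine names and figures:

```python
# Hypothetical machines: (name, peak teraflops, power draw in kilowatts).
# The names and numbers are invented for illustration only.
machines = [
    ("BigIron",  280.6, 1800.0),
    ("GreenBox",  20.0,   30.0),
    ("MidRange", 100.0,  400.0),
]

def megaflops_per_watt(teraflops, kilowatts):
    # 1 TFLOPS = 1e6 MFLOPS; 1 kW = 1e3 W.
    return (teraflops * 1e6) / (kilowatts * 1e3)

ranked = sorted(machines,
                key=lambda m: megaflops_per_watt(m[1], m[2]),
                reverse=True)
for name, tf, kw in ranked:
    print(f"{name}: {megaflops_per_watt(tf, kw):.1f} MFLOPS/W")
```

Note how a modest machine can top an efficiency ranking that a raw-speed list like the Top500 would bury it in, which is precisely the point of the new list.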
SDSU Experts Work Around the Clock to Gather Critical
Fire Data
SDSUniverse (10/26/07) Coartney, Lauren
Scientists at San Diego State University's SDSU Immersive Visualization
Center have been working almost nonstop since the first fires were reported
near Santa Ysabel. "We have links to all of these feeds, like NASA, Google
Maps, and Predator video," says Viz Lab co-director Eric Frost. "We're
constantly looking for more sources of precise information and different
ways to visualize it so we can show the location of the fires as close to
real time as possible." Normally, the Viz Lab is used to collect, process,
and analyze images from around the world for a variety of purposes, such as
finding sources of oil or revealing the sketches hidden beneath Leonardo da
Vinci paintings. The lab is also used to track and analyze natural
disasters, but this time the disaster is much closer to home. Frost and his team
have been generating data to fill in the gaps between information sources
and feeding the information to local emergency responders to help them
decide where resources are most urgently needed. "There's a technology
being used today that allows video taken by helicopters to be downloaded to
public safety Internet, providing information about where the fires are and
what the helicopters are seeing as they're fighting them," says Bob Welty,
SDSU Viz Lab co-director and SDSU Research Foundation director of homeland
security projects. Members of Google's disaster imagery team have been
working from the lab since the start of the fires and, through Google's
partnership with NASA, have provided the lab with video from an unmanned
aerial drone monitoring the fire from the air.
Germany Seeks Expansion of Computer Spying
Los Angeles Times (10/30/07) Murphy, Kim
German law enforcement authorities want to expand government-sanctioned
computer surveillance, citing the case of an abortive bombing in which
plans for the attack were on the laptop of one of the suspects. "What this
case showed us is that they are using laptops, they are using computers,
and it would have been very, very helpful to track them down with online
searches," says Gerhard Schindler, director of the German Interior
Ministry's counter-terrorism bureau. Germany is seeking authorization to
plant clandestine Trojans into suspects' computers so that files, photos,
diagrams, voice recordings, keystrokes, and other information can be
scanned and recorded. This proposal does not sit well with a nation whose
people carry bitter memories of official surveillance under past regimes.
The Interior Ministry reports that laws authorizing online searches have
already been passed in several European countries, and several more allow
such searches or are in the process of adopting similar legislation.
German parliament member Hans-Christian Stroeble says physical computer
searches are already permissible with court approval, but secret searches
would completely bypass legal procedures. "What we fear is that without
any hint of a criminal background, police can secretly go into computers,
maybe even the computers of political opponents, and spy them out, gaining
access to personal data like photos, diaries, love letters, things like
that," he says. Marc Rotenberg of the Electronic Privacy Information
Center says he has no awareness that searches via implanted software are
being carried out by U.S. authorities, but he notes that "it's also not
clear, given the current view of the president on his powers to conduct
electronic surveillance, that it hasn't been used."
PS3 Network Enters Record Books
BBC News (11/02/07)
The processing power of Sony's PlayStation 3 (PS3) has helped make the
folding@home (FAH) project the most powerful distributed computing network
in the world. According to Guinness World Records, FAH has more than 1
petaflop of computing power, which is the equivalent of 1,000 trillion
calculations per second. Through March of this year, FAH had signed up
about 200,000 PCs, giving it about 250 teraflops of computing power, but
the addition of 670,000 PS3s has pushed the project over the top. The
BlueGene/L, the world's fastest supercomputer, reaches a top speed of only
280.6 teraflops. Consumers and gamers participating in FAH are giving the
project spare processing power so researchers can study how the shape of
proteins affects various diseases, including Alzheimer's. The PS3 uses the
Cell processor, which is up to 10 times faster than current PC chips.
"It is clear that none of this would be even remotely possible without the
power of PS3, it has increased our research capabilities by leaps and
bounds," says Vijay Pande, a Stanford University professor who heads the
FAH project.
Al Qaeda Hacker Attack Scheduled to Begin November
11th
InformationWeek (11/01/07) Claburn, Thomas
DEBKAfile, an Israeli news site, asserts that Western, Israeli, Jewish,
Shiite, and Muslim apostate Web sites will be attacked by al Qaeda hackers
beginning on Nov. 11. DEBKAfile claims that bin Laden's "cyber legions"
are getting even with Western surveillance systems that have persistently
and effectively suppressed al Qaeda's Web presence. The Department of
Homeland Security emphasizes that DEBKAfile's report does not represent an
official U.S. alert, though the agency intends to seriously investigate the
threat, as it does all threats. Although Forbes and Wired News have
praised DEBKAfile for its journalism in the past, others debate the
trustworthiness of the source. Nonetheless, software called Electronic
Jihad 2.0 is obtainable online, and the most recent version of the software
facilitates a distributed denial-of-service attack. Though the idea of al
Qaeda being involved in a cyberattack is worrisome, the menace is no more
perilous than everyday security risks facing Internet users, says Marc
Zwillinger, a former cybercrime prosecutor with the Department of Justice.
In addition, modern networks are better equipped to handle denial of
service attacks than networks from several years ago, he says.
Setting a Cybersecurity Agenda for the 110th
Congress
Government Computer News (10/31/07) Jackson, William
At the Congressional High Tech Caucus on Wednesday, more than four dozen
representatives and senators started work on an IT legislative agenda for
the 110th Congress. Although numerous bills on computer crime,
infrastructure protection, spyware, and data breaches have been introduced
in both houses, and a number of bills are pending, few have made it to a
vote, and even fewer have become law. At the caucus the Consumers Union's
Jeannine Kenney pushed for a strong national breach notification law to
help protect personal identification from theft or exposure. "Industry and
government are not investing in cybersecurity measures," Kenney says. "We
need to create incentives to make these investments. One way to do that is
requiring that consumers are always notified when their personal
information is breached." Many in the information technology industry want
to see a national standard replace the 35 different state notification
laws, while the Cyber Security Industry Alliance says any notification law
should include safe harbors for businesses that deploy strong, pre-breach
security measures. Both Consumer Data Industry Association President
Stuart Pratt and Homeland Security Department chief privacy officer Hugo
Teufel III say collecting personal data can improve security and the
resulting risks to privacy are an acceptable trade-off, arguing that data
collection has been used to prevent fraud and that security and privacy go
hand in hand.
The Grill: Software Guru Grady Booch Is Hot on Linux,
Second Life and Busting Bureaucracy
Computerworld (10/29/07) Vol. 41, No. 44, P. 28; Haverstein, Heather
IBM Rational Group chief scientist Grady Booch, developer of the Unified
Modeling Language, says that software development is and will remain
fundamentally hard, and that every era will face a new level of complexity.
"Most of the interesting systems today are no longer just systems by
themselves, but they tend to be systems of systems," Booch says. "It is
the set of them working in harmony. We don't have a lot of good processes
or analysis tools to really understand how those things behave. Many
systems look dangerously fragile. The bad news is they are fragile. This
is another force that will lead us to the next era of how we build software
systems." Booch says that open source development represents an economic
process that yields applications that otherwise would not be developed
because they would not make money. He also says the operating
system wars are largely over and it would be best to decide on a common
platform. As for virtual worlds, and the numerous companies that are
pulling out of them, Booch says that they probably entered for the wrong
reasons. People are not going to go to a store in a virtual world to buy a
real world product, but virtual worlds can still be used by companies to
save money. Booch, for example, has given several lectures through Second
Life that would have otherwise required a lot of travel time and money.
Requirements and Services for Metadata Management
Internet Computing (10/07) Vol. 11, No. 5, P. 17; Missier, Paolo; Alper,
Pinar; Corcho, Oscar
A group of University of Manchester researchers outline the overall
requirements for metadata management and describe a model and service that
concentrates on RDF metadata to fulfill these requirements. They describe
the design of their service-oriented model for format- and
location-independent metadata management as being "based on the observation
that ... two simple properties are common to all metadata: namely, that
it's invariably associated with some underlying resource, and, optionally,
separate meta-information for interpreting metadata--an ontology, for
instance--might be available." Metadata management issues the researchers
focus on include heterogeneous data storage and retrieval, metadata
evolution and lifetime management, and access control to metadata. The
objective of the authors' semantic binding service (SBS) is to deliver a
uniform set of primitives for metadata resource management, although
they note that interoperability among heterogeneous metadata is not one of
the deliverables. "The SBS offers a uniform way to maintain correct
associations among resources, metadata, and knowledge entities whenever
they change, regardless of the differences in format and content among the
metadata elements," the researchers explain. Current metadata repositories
provide low-level data management with no awareness of the role the
data plays; the SBS builds on this low-level capability by delivering
a uniform metadata-management layer that is cognizant of the
interrelationships between resources, their annotations, the annotations'
lifetimes, and reference knowledge entities. Among the functions the SBS
can facilitate are the creation and destruction of semantic bindings (SBs)
and the maintenance of their logical state with respect to metadata lifetime;
the provision of service-based access to metadata content by forwarding
application-specific queries to the underlying metadata repository in lieu
of interpretation; and the delivery of a notification service to inform
interested clients of any modifications in the state of SBs, in accordance
with the proposed WS-Notification standard.
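The two-property observation (every piece of metadata is bound to a resource and, optionally, to meta-information for interpreting it) maps naturally onto a small data structure. The sketch below is a hypothetical miniature of a semantic binding with the update-and-notify behavior the article describes; the class, field names, and URIs are invented and are not the authors' actual SBS API:

```python
from dataclasses import dataclass, field
from typing import Callable, List, Optional

@dataclass
class SemanticBinding:
    # A binding ties a resource to its metadata and, optionally, to the
    # ontology used to interpret that metadata -- the two properties the
    # authors say all metadata shares.
    resource_uri: str
    metadata: dict
    ontology_uri: Optional[str] = None
    subscribers: List[Callable] = field(default_factory=list)

    def update(self, new_metadata: dict) -> None:
        # Keep the association current when the metadata changes, and
        # notify interested clients, mimicking the notification primitive.
        self.metadata = new_metadata
        for callback in self.subscribers:
            callback(self)

sb = SemanticBinding("urn:example:dataset1", {"title": "Sample"},
                     "urn:example:ontology")
events = []
sb.subscribers.append(lambda b: events.append(b.metadata["title"]))
sb.update({"title": "Sample v2"})
print(events)  # ['Sample v2']
```

In the real SBS the notification step would follow the WS-Notification standard rather than an in-process callback, but the shape of the interaction is the same.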