Carnegie Mellon Computer Poker Program Sets Its Own Texas
Hold'Em Strategy
Carnegie Mellon News (07/06/06)
A Carnegie Mellon researcher has developed a computer program based on
game theory that, while unable to defeat the world's leading players,
bested two leading "pokerbots" that rely on human expertise. Instead of
drawing on human knowledge, GS1, developed by computer science professor
Tuomas Sandholm and graduate student Andrew Gilpin, relies on an automated
analysis of the rules of poker. Just as chess was an early test of the
limits of artificial intelligence, today poker is a more sophisticated
challenge because it requires players to make decisions without knowing
what cards the other players are holding. The sheer volume of potential
combinations of bets and cards dealt and on the table--a billion times a
billion--is too large for even the most sophisticated computers to be able
to analyze every hand. The element of the unknown makes poker a better
test of the practical potential of artificial intelligence, Sandholm says.
"A lot of real-world situations have uncertainty in them and you have to
deal with the uncertainty," he said, adding that a poker algorithm could be
used in sequential negotiation and auctions in e-commerce applications.
Sandholm is a prominent researcher in e-commerce, having developed the
fastest algorithms for pairing supply and demand and applied artificial
intelligence to automatically set rules for electronic auctions and
negotiations. In his poker system, the computer precalculates the
strategies for the first two rounds of Hold'Em, then updates the
probability of each potential hand for the third and fourth round by
factoring in betting and the revealed cards. In an attempt to identify
strategically different hands, the pokerbot first groups strategically
equivalent hands together, and then strategically similar hands until it
arrives at a small enough number of groups that it can perform a
computational analysis.
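To make the grouping idea concrete, the sketch below buckets Hold'Em
starting hands by a crude strength heuristic; the heuristic and the bucket
count are hypothetical illustrations only, not GS1's actual abstraction
procedure, which is computed automatically from the rules of the game.

    # Illustrative sketch of hand abstraction ("bucketing"): strategically
    # similar starting hands are merged until few enough groups remain for
    # a strategy computation.  The strength score here is a toy heuristic.
    from itertools import combinations

    RANKS = "23456789TJQKA"

    def strength(card1, card2):
        """Crude score for a two-card starting hand."""
        r1, s1 = card1
        r2, s2 = card2
        hi, lo = sorted((RANKS.index(r1), RANKS.index(r2)), reverse=True)
        score = 2 * hi + lo
        if r1 == r2:
            score += 30        # pairs are much stronger
        if s1 == s2:
            score += 4         # suited hands gain flush potential
        return score

    def bucket_hands(num_buckets=10):
        """Group all 1,326 starting hands into a small number of buckets."""
        deck = [r + s for r in RANKS for s in "cdhs"]
        hands = sorted(combinations(deck, 2), key=lambda h: strength(*h))
        size = len(hands) // num_buckets + 1
        return [hands[i:i + size] for i in range(0, len(hands), size)]

    if __name__ == "__main__":
        buckets = bucket_hands()
        total = sum(len(b) for b in buckets)
        print(f"{total} hands reduced to {len(buckets)} buckets")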
SCinet Satisfies the Need for Speed
HPC Wire (07/07/06) Vol. 15, No. 27, Feldman, Michael
Though the 2006 Supercomputing Conference does not take place until
November in Tampa, efforts to create the conference's network, SCinet,
began at the end of last year. SCinet will provide comprehensive support
for the conference's networking activities, including standard commodity
use for the attendees and high-speed communications for the various
exhibits and demonstrations. Much of the infrastructure for the network is
donated by vendors, and more than 100 volunteers from universities,
national labs, supercomputing centers, and other groups are collaborating
on the development of SCinet. SCinet consists of three networks: the
high-performance network that supports demonstrations and the HPC Bandwidth
Challenge with 10 Gigabit Ethernet connections; the stable and reliable
commodity network that offers universal wireless access for all conference
attendees; and Xnet, the bleeding-edge network that, though not as reliable
as the other two networks, will be used to showcase next-generation
technologies. Xnet is always a bit of a gamble for conference organizers.
"The rule is that it can't do anything that would endanger the stability of
the other networks," said Dennis Duke, SCinet chairman for 2006. "Apart
from that, they can do anything they want. Last year, they set up an
InfiniBand network that actually carried some wide area network
traffic--very dramatic and successful." The HPC Bandwidth Challenge,
always a popular event, this year will focus more on production-level
networking than raw speed, emphasizing sustained throughput over peak
performance. Each year, the conference faces the challenge of upgrading
the hosting city's infrastructure to accommodate its bandwidth demands.
43rd Design Automation Conference to Feature 44 New
Exhibitors
Business Wire (07/05/06)
Nearly 250 companies will showcase design tools, services, and technology
during ACM SIGDA's 43rd Design Automation Conference (DAC), including 44
that will be exhibiting for the first time. New exhibitors at the premier
event of the electronic design automation (EDA) industry include Advanced
Circuit Engineers, Barth Electronics, Certicom, Fortelink, Innovative
Silicon, Liga Systems, Polyscale Computing, SimPlus Verification, SPACE
Codesign, and Western Scientific. "We are very pleased with the number of
companies exhibiting for the first time at DAC this year," says Ellen
Sentovich, 43rd DAC general chair. "The exhibit floor is a great panorama
of the latest trends in the EDA community and this year's new exhibitors
will add to the innovative ideas and technologies that emerge each year."
The conference will be held at the Moscone Center in San Francisco from
July 24-28, 2006. The exhibit will run from Monday, July 24, through
Thursday, July 27. For more information about DAC, or to register, visit
http://www.dac.com/43rd/index.html
SensorMap Delivers Real-Time Data on the Go
Microsoft Research (07/05/06) Knies, Rob
Microsoft's SensorMap promises to provide people with a customized search
experience through a combination of static and real-time data that could,
for instance, retrieve a list of local restaurants with a waiting time of
less than 30 minutes. The SensorMap platform will act as a portal for
publishers who want to make their real-time data searchable. The
technology powering SensorMap, developed by Microsoft's SenseWeb team,
came together around January, and an enhanced prototype received a positive
reception at the International Conference on Information Processing in
Sensor Networks in April, sponsored by ACM and the Institute of Electrical
and Electronics Engineers. The shortage of data right now is the main
reason that a system comparable to SensorMap has not yet been developed,
says Microsoft's Suman Nath. "Basically, there is this chicken-and-egg
problem. There is no application because there is no data, and there is no
data because people don't know how useful that data is." Once it collects
the information, SensorMap indexes it so that it is searchable. The
database that stores the location data works alongside a Web service that
publishes sensor data and a server-side query processor. Some Web-based
mapping services offer sensor information about weather or traffic, but the
maps are not optimized for retrieval of the most relevant information. The
system maintains a central database that stores the metadata describing the
sensors themselves, and another module that gleans the live data from the
sensors, processes that data, and relays it to the users. The most
significant challenge is processing the queries on a large, distributed
scale, Nath says, though he is encouraged by the way the system has
developed so far.
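A minimal sketch of the publish-and-query split described above might look
like the following; the class and field names are hypothetical and are not
the SensorMap API.

    # Sketch of a metadata store for sensor descriptions plus a simple
    # geographic query step.  All names and fields here are illustrative.
    from dataclasses import dataclass

    @dataclass
    class Sensor:
        name: str
        kind: str          # e.g. "traffic", "weather", "wait-time"
        lat: float
        lon: float
        read_url: str      # where the live reading can be fetched

    class SensorRegistry:
        """Stands in for the central metadata database."""
        def __init__(self):
            self.sensors = []

        def publish(self, sensor):
            self.sensors.append(sensor)

        def query(self, kind, lat_range, lon_range):
            """Return sensors of one kind inside a bounding box."""
            return [s for s in self.sensors
                    if s.kind == kind
                    and lat_range[0] <= s.lat <= lat_range[1]
                    and lon_range[0] <= s.lon <= lon_range[1]]

    if __name__ == "__main__":
        registry = SensorRegistry()
        registry.publish(Sensor("cafe-42", "wait-time", 47.61, -122.33,
                                "http://example.org/cafe-42"))
        registry.publish(Sensor("i5-north", "traffic", 47.65, -122.32,
                                "http://example.org/i5-north"))
        print(registry.query("wait-time", (47.5, 47.7), (-122.4, -122.2)))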
Software Tools Detect Bugs by Inferring Programmer's
Intentions
University of Illinois at Urbana-Champaign (07/06/06) Kloeppel, James
University of Illinois computer science professor Yuanyuan Zhou and her
students have developed a suite of tools that can identify and correct
software bugs by inferring the intentions of the programmer. The tools
work by observing how the programmer writes code. "Most
bug-detection tools require reproduction of bugs during execution," Zhou
said. "The program is slowed down significantly and monitored by these
tools, which watch for certain types of behavior. Most of our tools,
however, work by only examining the source code for defects, requiring
little effort from the programmers." Code in large programs is often
copied and pasted, which, while saving a significant amount of time, is a
frequent cause of bugs. Using data mining techniques, Zhou's CP-Miner
searches through programs for copy-pasted code and scans for consistent
modifications. CP-Miner, which can scan 3 million to 4 million lines of
code in less than 30 minutes, has already found numerous bugs in some of
the most popular open-source applications. Since large programs often rely
on implicit rules and assumptions, Zhou and her students developed the
PR-Miner tool to determine when those rules have been broken. Like
CP-Miner, PR-Miner uses data-mining techniques and works very quickly.
Zhou and her students have also developed tools to help software keep
running even in the presence of bugs, such as the Rx recovery tool. Zhou
says, "Rx is avoidance therapy for software failure. If the software
fails, Rx rolls the program back to a recent checkpoint, and re-executes
the program in a modified environment." Another tool, Triage, identifies
and diagnoses the nature of a failure at the end-user site and helps the
programmer work to correct it.
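The kind of copy-paste defect CP-Miner targets can be illustrated with a
toy sketch; the line-by-line identifier-mapping heuristic below is an
assumption for illustration, not CP-Miner's actual data-mining algorithm.

    # Toy illustration of a copied-and-pasted block in which one
    # identifier rename was forgotten, the classic copy-paste bug.
    import re

    ORIGINAL = """\
    lock(queue_a);
    queue_a.push(item);
    unlock(queue_a);"""

    PASTED = """\
    lock(queue_b);
    queue_b.push(item);
    unlock(queue_a);"""   # bug: the last rename was missed

    def identifiers(line):
        return re.findall(r"[A-Za-z_]\w*", line)

    def find_inconsistent_renames(original, pasted):
        """Report identifiers mapped to more than one name in the copy."""
        mapping = {}                  # original identifier -> copied names
        lines = zip(original.splitlines(), pasted.splitlines())
        for o_line, p_line in lines:
            for o_id, p_id in zip(identifiers(o_line), identifiers(p_line)):
                mapping.setdefault(o_id, set()).add(p_id)
        return {k: v for k, v in mapping.items() if len(v) > 1}

    if __name__ == "__main__":
        # Prints {'queue_a': {'queue_a', 'queue_b'}}: a likely missed rename.
        print(find_inconsistent_renames(ORIGINAL, PASTED))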
Creating a Better Sense of 'Being There' Virtually
IST Results (07/05/06)
In an attempt to determine how close virtual reality can come to
simulating the feeling of presence, or "being there," researchers working
under the auspices of the OMNIPRES project have developed a host of metrics
to gauge a person's response to a virtual reality situation. Creating a
convincing sense of presence in a virtual reality environment is critical
for many applications, says Wijnand IJsselsteijn, OMNIPRES project
coordinator. "In presence engineering we're trying to broaden the
human-machine bandwidth," he said. "We need to be able to say how real VR
has to be to work therapeutically, which is the measuring component, but
also how to improve the sense of reality, which is the research component."
The relatively new discipline of presence engineering seeks to enhance
humans' technology experience through a scientific study of the senses. To
develop more sophisticated, intelligent systems that understand humans'
capacities and limitations, it is necessary to study the brain and social
interaction, says IJsselsteijn. The OMNIPRES project created a two-part
"compendium of presence measures." The first, now available online, is a
list of around 30 self-measurement questions designed to assess the
experience from the user's standpoint. The second category involves more
objective measures, including neural correlates. "Typically, methods of
this type are looking for response similarities--testing whether people
respond (physically, emotionally, or psychologically) to VR as they do in
real life," IJsselsteijn said. Though the project officially concluded
last September, the researchers are still developing a handbook that
condenses the knowledge acquired over the study.
Open-Source Lobby Weighs in on EU Patent Law
IDG News Service (07/05/06) Meller, Paul
In deciding the future of Europe's patent system, open-source developers
are second only to corporations in their influence, according to the
European Commission. The business community accounted for 40 percent of
the 2,515 written submissions in a consultation led by the Commission
earlier this year, while the open-source community contributed 24 percent.
Most of the replies from the open-source community were submitted by
Florian Mueller, founder of nosoftwarepatents.com, an interest group that
succeeded in scuttling an earlier proposal for legislation on software
patents. Mueller is fighting efforts for a single, community-wide patent.
The current system, which is four times as expensive as the U.S. system,
requires inventors to apply to the European Patent Office for a patent, and
then register the patent in every country where they plan to use the
invention. The community patent, which has been a longstanding goal of
European politicians, will receive one more hearing next week, and then the
Commission will have to decide whether to go forward with the patent or
scrap the project. Though the Commission claims that private enterprise is
on its side, some prominent industry groups have lobbied against it,
fearing a repeat of the lobbying mess that torpedoed last year's software
patent initiative, known as the directive on computer-implemented
inventions. "To start a debate about the community patent now would be
like opening a Pandora's Box," said Francisco Mingorance of the Business
Software Alliance.
DHS Outlines Plan to Protect Critical Telecommunications
Infrastructure
RCR Wireless News (07/05/06) Weaver, Heather Forsgren
The federal Department of Homeland Security has released the National
Infrastructure Protection Plan (NIPP) designed to protect U.S. critical
infrastructure, including IT and communications networks. Key to the plan
will be a risk-management approach that tailors protection according to the
characteristics of individual sectors. Each sector has been assigned to a
specific department, and essential to the plan will be cooperation from
private companies, including the sharing of sometimes confidential
information. "The National Infrastructure Protection Plan is the path
forward on building and enhancing protective measures for the
critical-infrastructure assets and cyber systems that sustain commerce and
communities throughout the United States," says Homeland Security
undersecretary for preparedness George Foresman. "The NIPP formalizes and strengthens
existing critical-infrastructure partnerships and creates the baseline for
how the public and private sectors will work together to build a safer,
more secure and resilient America."
Microsoft Warns of 'Acute' UK Skills Shortage
ZDNet UK (07/05/06) Barker, Colin
Microsoft has released a report warning that the United Kingdom faces a
critical shortage of workers with programming skills, citing the decline in
interest among students since enrollment peaked in preparation for the Y2K
bug. The number of applicants in the United Kingdom for degree programs in
computer science, engineering, and information systems has dropped to
pre-1996 levels. "The UK is facing an acute and growing shortage of
high-end software skill," said Microsoft's Michael Bishop. "With the same
passion that young people enjoy the music players and computer games which
the industry develops, they need to realize that their own future lies in
creating the software and the applications that enable those experiences."
Other analysts are more optimistic about the situation, though they
acknowledge that outsourcing and other challenges make this a difficult
time for the industry. The problem affects not only the United Kingdom,
claims Elizabeth Sparrow of the British Computer Society, noting that jobs
are also being lost in Japan, the United States, and many other countries.
The United Kingdom actually enjoys the third-best IT trade surplus in the
world, behind India and Ireland, Sparrow says, adding that one of IT's main
problems is the issue of perception. "Yes, we need some really, really
expert nerds, the ones who are on top of their profession. But for the
bulk we need people with a much broader range of skills. That is not well
understood and is not being put across to people in schools today," she
said. Interest in IT is also undermined by the one-sided media coverage
that paints a consistently gloomy picture, overlooking the numerous
successes credited to UK computer scientists, said Edward Truch of the
University of Lancaster.
Concerns About Fraud Continue to Plague Users of
Electronic Voting Machines
Computerworld (07/03/06) Songini, Marc
A new report warns of vulnerabilities in e-voting machines that could
disrupt upcoming elections unless precautions are taken. The report was
compiled over 18 months by a task force of computer scientists and voting
machine experts set up by the Brennan Center for Justice at the New York
University School of Law. In recent years, half of manual voting machines
throughout the country have been replaced by e-voting systems, such as
touch-screen and optical-scan machines, notes Larry Norden, a Brennan
Center attorney and chairman of the task force. Election officials have
turned to electronic systems to comply with federal requirements, though
Norden said security procedures have not necessarily kept pace with the
technology. The report identified 120 potential e-voting threats, noting
the absence of a detection system for malicious software attacks in most
states. Critics of e-voting security include Ion Sancho, elections
supervisor for Leon County, Fla., who said the report confirms his gravest
fears, but others are less convinced. "The fundamental premise of the
Brennan report and many activists is that it's easy to rig a machine to
throw an election," said Michael Shamos, a professor at Carnegie Mellon
University. "It isn't." The report calls for elections officials to
remove wireless components, randomly audit paper records, and decentralize
the administration and programming of the machines. Norden said there is
still time to implement these precautions before the November elections,
and every secretary of state in the country is receiving a copy of the
report. For information on ACM's e-voting activities, visit
http://www.acm.org/usacm
Battle Lines Drawn Over Net Neutrality
IDG News Service (07/07/06) Gross, Grant
A number of powerful players are coming out both for and against Net
neutrality, an issue that is currently being debated in the U.S. Congress.
On one side of the debate are broadband providers such as AT&T and
BellSouth, which argue that a Net neutrality law would prevent them from
exploring new business plans as a way to pay for more advanced broadband
networks. Broadband providers also say that they should be free to divide
up their broadband pipes to offer services such as IP-based television
services. One such business plan, which officials from AT&T and BellSouth
have advocated in recent months, would charge e-commerce companies for
preferential routing of traffic to their sites. However, executives
at Internet companies such as Amazon.com and Google say they already pay
millions of dollars in Internet fees every year. They also say the lack of
a Net neutrality law--coupled with recent decisions by the FCC and the U.S.
Supreme Court that effectively deregulated broadband--would give broadband
providers free rein to block or degrade Web content from competitors such
as independent VoIP or video providers. For their part, broadband
providers have repeatedly said they will not block or impair their
customers' existing access to competing content or services. A number of
manufacturers--including Alcatel, Cisco, Corning, and Qualcomm--also say
broadband providers do not block or impair competing content. In a May 17
letter to congressional leaders, the manufacturers pointed out that passing
a Net neutrality bill risks "hobbling the rapidly developing new
technologies and business models of the Internet with rigid, potentially
stultifying rules."
Tech Pros Aren't Worried About Losing Jobs, at Least for
Now
InformationWeek (07/05/06) McGee, Marianne Kolbasuk
Technology workers are feeling more confident about their job security
than they have in years, a new Hudson report finds. Only 19 percent are
worried about losing their jobs, the lowest figure since Hudson began
conducting monthly job confidence surveys two years ago; in May, 28 percent
of tech workers were concerned about job losses. The shifting business
practices at many companies are reflected in the improved perception of job
security, according to Hudson's Jeff Nicoll. With many companies
standardizing business practices such as recruiting, hiring, and backroom
operations on a national level, technology workers are enjoying a stronger
sense of ownership and empowerment, while also feeling more confident about
their contribution to a team effort. Job confidence among technology
workers was 108.2, measured against a base score of 100--the highest of all
industries surveyed. The average score was 102.4.
How Humanoids Won the Hearts of Japanese Industry
Financial Times (07/03/06) P. 8; Cookson, Clive
More than any other country in the world, Japan has embraced the
development of humanoid robots, with projects underway at major companies
such as Honda, Toyota, Hitachi, NEC, and Mitsubishi. Sony, the conspicuous
exception, pulled the plug on its Aibo and Qrio robots earlier this year
due to cost constraints. Today's robots have little commercial appeal
because they are prohibitively expensive and generally not intelligent or
flexible enough to be particularly useful. "A bipedal walking robot today
costs more than a Ferrari," said Hirohisa Hirukawa, who runs the humanoid
robot program at the National Institute of Advanced Industrial Science and
Technology. "If we can find a nice application and sell a million of them,
the price would fall to that of a cheap car." While the entertainment
industry currently holds the most promising applications for robots, the
ultimate goal is for them to take on monotonous or unpleasant tasks that
humans are reluctant or unwilling to do. In countries such as Japan, which
has a declining working-age population, robots could be used to care for
the elderly. The Japanese are generally more accepting of integrating
robots into their everyday lives, a matter that has been the subject of
considerable study by sociologists. Though Japan is well ahead of other
countries in its robotics programs, manufacturers realize that a mass
market will only gradually materialize as robots become smarter and more
agile. Japanese robotics professor Takeo Kanade, who has worked at
Carnegie Mellon University since 1980, says, "The two biggest mechanical
issues are to make them faster and safer in their movements. But those
will be easier to solve than giving robots human-like intelligence."
Flexible Display Technologies to Provide New Twist for
Computing
Computerworld (07/03/06) Mitchell, Robert
In the coming decade, flexible substrates such as thin-film polymers could
dramatically reshape the way people use displays, with flexible e-paper and
roll-up displays poised to become a viable commercial reality. Laptops
could soon have a second display that is accessible even when the device is
turned off, and some displays could even be embedded in a shirt sleeve or a
watch band. "We're talking about electronics we can wrap around a pencil,"
said Hewlett-Packard's Jim Brug. Rather than replacing existing LCDs, HP
and other companies are developing flexible displays, some as large as 14
inches diagonally, to augment current display technology and expand into
new surfaces. Creating a flexible display from conventional LCD technology
entails eliminating the backlight and replacing with a flexible substrate
the two glass layers that sandwich the liquid crystal layer and the
thin-film transistor layer, which is normally fabricated in amorphous
silicon on glass. Liquid crystals are resistant to bending, however, and
flexing the substrate distorts the image. Among the alternative
technologies that companies are currently working on to develop flexible
displays are reflective e-paper and emissive organic light-emitting diode
(OLED) technologies. As an alternative to amorphous silicon, researchers
are exploring the technique of "ink jet" printing, using organic materials
to print transistors directly onto a thin polymer sheet. Other
possibilities include imprint lithography and stainless-steel foil
substrates that are capable of withstanding high temperatures. The first
incarnation of e-paper displays has been used in e-book readers, store
signage, and retail-shelf price labels. E-paper will become a commercial
reality before OLEDs, according to iSuppli's Kimberly Allen.
The Net Reloaded
New Scientist (07/01/06) Vol. 191, No. 2558, P. 40; Krieger, Kim
The discovery of a "power-law" pattern consistent across all different
kinds of networks, including the Internet, led to the theory that the
Internet could be undone if its most highly connected computers were taken
out, but increasing numbers of scientists are disputing this "scale-free"
theory. They are crafting postulations that consider the design,
evolution, and structure of specific networks, and these theories underlie
statistical complex-system modeling techniques. The most commonly accepted
explanation for the occurrence of power-law distributions in networks is
the concept of preferential attachment, which dictates that well-connected
things tend to accrue more and more connections. But John Doyle of the
California Institute of Technology argues that because the most
highly-connected U.S. Internet routers are situated on the network's
fringes, taking these machines down would have no impact on the worldwide
flow of most Internet data. "The approach of scale-free models was
diametrically opposite to the types of models that are truly useful, which
are grounded in specificity," explains MIT science historian Evelyn Fox
Keller. Doyle is a vocal advocate for an alternate model of power-law
networks that proposes highly optimized tolerance (HOT), based on the idea
that a network's evolutionary path is based on what it is designed to do
and its physical limitations. The simplest HOT model is the profit-loss
resource model, which characterizes a complex system as a trade-off between
the resources spent and the losses incurred, and assumes that the system
can be configured in a single optimal manner. Though Harvard University computer
scientist Michael Mitzenmacher concedes the usefulness of HOT, he notes
that the theory, like preferential attachment, cannot account for all
system behavior.
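Preferential attachment itself is simple enough to sketch in a few lines;
the toy generator below (a simplified variant for illustration, not a
faithful model of the Internet) grows a network in which a handful of hubs
accumulate most of the links.

    # Minimal sketch of preferential attachment: each new node links to an
    # existing node with probability proportional to that node's degree,
    # which yields a heavy-tailed ("power-law"-like) degree distribution.
    import random
    from collections import Counter

    def preferential_attachment(n_nodes, seed=1):
        random.seed(seed)
        # Start from a single edge between nodes 0 and 1.  Every edge
        # contributes both endpoints to this list, so sampling it picks
        # nodes in proportion to their degree.
        endpoints = [0, 1]
        degree = Counter({0: 1, 1: 1})
        for new in range(2, n_nodes):
            target = random.choice(endpoints)
            endpoints += [new, target]
            degree[new] += 1
            degree[target] += 1
        return degree

    if __name__ == "__main__":
        deg = preferential_attachment(10000)
        print("most-connected nodes:", deg.most_common(5))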
Take the Agile Path to True Balance
Software Test & Performance (06/06) Vol. 3, No. 6, P. 26; Leffingwell,
Dean
Agile software development is superior to the waterfall development model
because it supports improved productivity and software quality, and the
various agile techniques are consistent in that they enable a team to
deliver a small amount of working code in a time box. The agile testing
phase is distinctive in that code is tested as it is written, and thus the
functional barriers between product owners, developers, and testers are
removed. Agile teams adhere to the principles that all code is tested
code; that tests are written either prior to or simultaneously with the
code itself; that the test writing process requires the participation of
all team members; and that a running system baseline is always in effect.
There is general consensus among agile teams that testing practices
fall within four primary tiers: unit testing, acceptance testing,
component testing, and system and performance testing. Unit testing
entails the writing of testing code by developers to test their target code
at the module level, while acceptance testing involves functional testing
by any stakeholder who can assess any new code that has been authored
against its requirements. Component testing is performed with tools and
practices that ought to be applicable to languages the agile teams use and
the kinds of systems they are constructing. Finally, system and
performance testing is geared toward evaluating the system in its entirety,
often by testing it against other systems that may not even be accessible
to the team. The point of all these testing practices is for the team to
add a small amount of code and ascertain within hours that the system still
fulfills all the needs that have been imposed on it to date. Teams also
commit a certain amount of time within an iteration to "hardening," in
which the activities that must be completed before the system is ready for
release are carried out.
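The "all code is tested code" principle can be illustrated with a minimal
sketch in which a unit test accompanies the function it exercises; the
function shown is hypothetical.

    # Minimal sketch: the test is written alongside (or before) the code,
    # and the whole suite runs on every small addition to the system.
    import unittest

    def apply_discount(price, percent):
        """Return price reduced by the given percentage."""
        if not 0 <= percent <= 100:
            raise ValueError("percent must be between 0 and 100")
        return round(price * (1 - percent / 100), 2)

    class TestApplyDiscount(unittest.TestCase):
        def test_typical_discount(self):
            self.assertEqual(apply_discount(200.0, 25), 150.0)

        def test_zero_discount_leaves_price_unchanged(self):
            self.assertEqual(apply_discount(99.99, 0), 99.99)

        def test_invalid_percent_is_rejected(self):
            with self.assertRaises(ValueError):
                apply_discount(10.0, 150)

    if __name__ == "__main__":
        unittest.main()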
Re-Centering the Research Computing Enterprise
Educause Review (06/06) Vol. 41, No. 3, P. 84; McRobbie, Michael A.
The key issue for institutional central data centers has shifted from
physical size and floor space to reliable power and heat management, writes
Indiana University interim provost and vice president for academic affairs
Michael A. McRobbie. But he argues against the decentralized data center
management that many schools and departments are calling for, and posits
instead that data center centralization is more critical than ever,
particularly as it relates to research computing. "I assert that these
resources could be used much more efficiently and effectively to provide a
better quality of services supporting an even larger amount of excellent
research," McRobbie attests. A coordinated, central strategy for
supporting IT in research will help institutions facing unaffordable data
center costs by prioritizing research requirements, coordinating grant
submissions, and helping fund cyberinfrastructure needs through the use of
indirect cost recoveries. "Through a coordinated, centralized approach,
the resources that departments use to run their own systems can be
reallocated and pooled to achieve economies of scale and to serve
populations incrementally larger than the combined departmental headcount,"
says McRobbie, adding that this model can enable the delivery of
centralized maintenance and support services at less cost and higher
quality of service. He suggests that CIOs ask themselves if such an
approach can yield improved research computing for all stakeholders, and
recommends that the CIO undertake cyberinfrastructure planning in
collaboration with the leading institutional researchers. Given the
specialization of data center design, it makes sense for CIOs to hire
design engineers who can effectively structure centers that address both
present and future needs.
Web Browsing on Small-Screen Devices: A Multiclient
Collaborative Approach
IEEE Pervasive Computing (06/06) Vol. 5, No. 2, P. 78; Hua, Zhigang; Lu,
Hanqing
Zhigang Hua and Hanqing Lu of the Chinese Academy of Sciences' Institute
of Automation propose a method to make Web browsing on small-screen devices
a less tedious prospect for users with a new multiclient collaborative
system featuring a two-level browsing arrangement. The system
automatically builds an aggregation profile during the initialization of
the multiclient communication in an ambient environment, and the devices
involved in the Web interactions are designated as either master devices
(the device currently being employed for Web browsing) or slave devices
(the ambient devices operating collaboratively based on the user's Web
interactions on the master device). Communication between devices in an
ambient environment is maintained by a communication supporter, while a
user's Web browsing interactions on the master device are monitored by an
interaction watcher. The detected interaction is then fed into a
collaboration translator, whose job is to parse user interactions into
correlating display updates on slave devices. The last component of the
system is a slave device-based collaboration performer, which automatically
displays data updates from the master device on a client browser. The
browsing scheme features within-page browsing and between-page comparative
browsing; the first browsing model involves the display of a thumbnail
overview by a master device and the display of detailed content by a slave
device. The arrangement allows the user to easily choose a different block
on the master device, causing the detailed content to be displayed on
another slave device. In the between-page comparative browsing model, the
master device displays one page and allows the user to scroll for more
information while the slave device shows the comparative page and
automatically scrolls to the matched term of interest.
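The interaction-watcher, collaboration-translator, and
collaboration-performer loop described above might be sketched as follows;
the class structure and message fields are hypothetical, not the authors'
actual implementation.

    # Toy sketch of the master/slave collaboration loop: an interaction on
    # the master device is translated into a display update that each
    # slave device renders.
    class SlaveDevice:
        """Collaboration performer: renders updates pushed by the master."""
        def __init__(self, name):
            self.name = name

        def render(self, update):
            print(f"[{self.name}] showing {update['view']} of block "
                  f"{update['block_id']}")

    class MasterDevice:
        """Holds the thumbnail overview and watches user interactions."""
        def __init__(self, slaves):
            self.slaves = slaves

        def on_block_selected(self, block_id):
            # Collaboration translator: map the interaction to an update.
            update = {"view": "detailed content", "block_id": block_id}
            for slave in self.slaves:
                slave.render(update)

    if __name__ == "__main__":
        master = MasterDevice([SlaveDevice("pda-1"), SlaveDevice("pda-2")])
        master.on_block_selected("news-headline-3")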