IBM, CSTA Team to Boost Computing Skills Among High
School Students
IT News Online (04/16/06)
IBM and ACM's Computer Science Teachers Association (CSTA) have joined
forces to promote computer science in high schools and will develop
customized computer-science course materials for more than 36,000
secondary-school teachers, giving them instant access to lesson plans,
guidebooks, and subject overviews for integrating coding and Web design into
math and science classes. IBM says that computer science, long established
as a discipline in higher education, has yet to be fully incorporated into
the primary and secondary school curricula in the United States. IBM and
CSTA are moving forward with the new resources after a successful pilot
program. "The structure of the lessons encouraged students to think
through the design of a computer program, from problem statement to
solution. I have found the design process generally hard to teach and
these lessons helped significantly ease my instruction," said Shane
Torbert, a teacher at Virginia's Thomas Jefferson High School for Science
and Technology, which participated in the pilot program. The IBM/CSTA
program is designed to address the concern that there will not be enough
skilled workers to propel the IT industry in the future. The pre-formatted
lesson plans adhere to the curriculum guidelines articulated in the "ACM
Model Curriculum for K-12 Computer Science." The new resources also have a
strong collaborative focus that will teach students how to work together to
solve problems. The IBM Academic Initiative, a training program already in
use at more than 1,900 institutions, will support the new resources, which
can also be downloaded from CSTA's Web page. Among the resources are an
application where students program the video game Pong using Java, as well
as modules detailing the design and development of Web pages and
project-based learning.
For more information on CSTA, visit
http://csta.acm.org
At Computing's Olympics, Russian Teams Take Gold and
Silver--and MIT Finishes 7th
Chronicle of Higher Education (04/13/06) Read, Brock
A team from Saratov State University in Russia earned the highest honors
at the Association for Computing Machinery's (ACM) 30th annual
International Collegiate Programming Contest on Wednesday. Eighty-three
three-student teams competed, attempting to solve 10 sophisticated
computing problems in five hours. The contest, initially
dominated by the United States, has recently seen teams from Europe, Asia,
and Australia take top honors. MIT, the only U.S. team to finish in the
top 20, placed seventh, solving four of the 10 problems. By successfully
completing five problems, the team from Saratov State earned $10,000
scholarships and bragging rights. The team from Altai State Technical
University, another Russian school, also solved five problems, though it
took longer to do so and settled for second place. Each time a team
completed a problem, contest officials let a balloon float to the top of
the large, open assembly hall where the students worked. Of the 16 teams
from the United States in the contest, only Princeton University and DePaul
University joined MIT in finishing in the top half. Schools with noted
computer science and engineering departments such as Carnegie Mellon, the
California Institute of Technology, and Duke all faltered early. ACM
President David Patterson said the decline of U.S. computer science could
stem from widely publicized fears of outsourcing. "Every high-school
senior thinks every programming job has already gone to India," he said.
"There's this assumption that computer science, as a profession, is
completely over, even though the facts aren't nearly as dismal as the
folklore." Patterson also said that Asian and Eastern European schools
take the contest more seriously than U.S. institutions. U.S. teams could
simply be falling prey to increased competition, said Martin Rinard, who
coached the MIT team. Roughly 5,600 teams attempted to qualify, compared
with 1,100 in 1997, the last year a U.S. team won.
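The ranking rule the article describes (more problems solved wins; ties broken by less total time, which is how Saratov State edged out Altai State) can be sketched in a few lines. The team names and minute totals below are illustrative placeholders, not the actual standings data:

```python
# Hypothetical sketch: team data is illustrative, not the real 2006 results.

def rank_teams(results):
    """Sort (team, problems_solved, total_minutes) tuples into contest order:
    more problems solved ranks higher; ties are broken by less total time."""
    return sorted(results, key=lambda r: (-r[1], r[2]))

standings = rank_teams([
    ("Altai State Technical", 5, 1400),  # solved as many as the winner...
    ("Saratov State", 5, 1200),          # ...but Saratov was faster
    ("MIT", 4, 900),
])
print([team for team, _, _ in standings])
# → ['Saratov State', 'Altai State Technical', 'MIT']
```

The actual ICPC rules also add time penalties for rejected submissions; the sketch keeps only the two criteria the article mentions.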
For more information about the 30th ICPC, visit
http://icpc.baylor.edu/icpc
New Linux Look Fuels Old Debate
CNet (04/17/06) Shankland, Stephen
The use of proprietary drivers to bring new graphics to the Linux
interface is reviving the debate over whether it is acceptable to use
closed applications in the open-source operating system. On the purists'
side, the Free Software Foundation (FSF) argues that Linux is under the
jurisdiction of the General Public License (GPL), which prohibits the use
of proprietary drivers. Chipmakers, however, are refusing to open access
to their proprietary 3D graphics drivers. "If Linux expects broader vendor
support, the community needs to capitulate to proprietary software
involvement," said Raven Zachary of the 451 Group. Developing graphics
drivers without the support of leading graphics chipmakers Nvidia and ATI
is difficult, and reverse-engineering attempts often fall behind schedule
and produce only pale imitations of the commercial drivers. ATI says that
its Radeon X1000 driver is proprietary for intellectual property reasons,
while Nvidia's Andrew Fear says the company's GeForce 7 driver is
closed because the level of difficulty entailed in making a graphics driver
is such that open-source development would not help. Linux founder Linus
Torvalds' willingness to make some concessions to proprietary
technologies, provided they are not derived from the kernel, casts him in
stark opposition to the FSF and kernel programmers such as Greg
Kroah-Hartman, who developed a patch to block proprietary drivers from
loading into his USB subsystem. Red Hat CTO Brian Stevens argues for the
open-source drivers, noting that the vast talent pool that supports
open-source applications would inevitably improve the product. For its
part, Intel is partnering with the open-source community to develop drivers
that it says should enable it to compete with Nvidia and ATI. Many also
believe that Linux will be more receptive to new drivers once it develops a
stable interface.
Q&A: Gambling on Women Technologists in Las Vegas
Computerworld (04/14/06) Hoffman, Thomas
The question of whether women have broken the glass ceiling in information
technology would have to be answered on a company-by-company basis, says
Laura Fucci, vice president and CTO of MGM Mirage in Las Vegas, in an
interview with Computerworld. "I'm not sure how much of this is about the
company and how much of it is about the women and how they've been brought
up, how they express themselves, whether they have been taught to hold
themselves back, etc.," says Fucci. Nonetheless, Fucci wants to see more
young women pursue careers in IT, and she has played a key role in the
launch of the Las Vegas chapter of Women in Technology International
(WITI). The Las Vegas group, which has attracted 50 members through
word-of-mouth promotion, is scheduled to hold its first meeting on May 3 at
Mandalay Bay Resort and Casino during the Interop 2006 conference. The
initial gathering will focus on creating a local network for achieving
common goals of growth and development, while a June 7 meeting at Southern
Nevada Community College will be devoted to gathering ideas on how to get
young girls interested in IT. Fucci sees opportunities in mentoring and
acting as role models, internships, and scholarships. She says the
perception that IT is for geeks has to be changed. "What I hear is that by
the time girls hit high school, it's already too late to change those
perceptions," says Fucci.
For information about ACM's Committee on Women in Computing, visit
http://women.acm.org
In Silicon Valley, a Man Without a Patent
New York Times (04/16/06) P. BU1; Markoff, John
Silicon Valley entrepreneur Geoff Goodfellow came up with the idea of
sending email to a wireless device in 1982, though instead of a BlackBerry,
Goodfellow was thinking in terms of a pager. He received funding to
develop the service in the early 1990s, though it failed, and Goodfellow
left the country and got out of the technology business. In 2002, James
Wallace, an attorney who represented NTP in its patent dispute against RIM,
flew to Prague to introduce himself to Goodfellow out of concern that his
earlier research could jeopardize NTP's patent claims. Goodfellow eschews
the use of patents to protect his work, however, and Chicago inventor
Thomas Campana Jr., who went on to found NTP, patented the concept for
wireless email nearly a decade after Goodfellow's work. Campana died in
2004, though his patent led to a $612.5 million settlement reached with RIM
last month. Many analysts look to Goodfellow's story as an example of the
inherent flaws in the patent system, which they claim now benefits large
corporations and lawyers more than it protects individual creativity and
innovation. While some experts have argued that Goodfellow's work could
constitute relevant prior art and should have been included in the NTP
case, he is just as happy to steer clear of patent proceedings altogether,
as are many other Silicon Valley innovators. Goodfellow, who was retained
by NTP as a contract consultant for several days' work in 2002, developed
his idea for wirelessly sending messages in his early days in Silicon
Valley, working on the original Arpanet, the precursor to the modern
Internet. He published his idea on an Arpanet mailing list in 1982 in a
post called, "Electronic Mail for People on the Move." Goodfellow
ultimately left Silicon Valley at the peak of the dot-com boom, mildly
disillusioned at the false economies and "zero-billion-dollar industries"
that were making many of his colleagues wealthy, though he has since
returned to chair a startup company developing VoIP technology.
Report Details DMCA Misuses
InternetNews.com (04/14/06) Miller, David
The Electronic Frontier Foundation (EFF) has issued a report criticizing
many of the misuses of the Digital Millennium Copyright Act, the 1998 law
enacted to safeguard intellectual property in the digital era. Among the
stories included in "Unintended Consequences: Seven Years Under the DMCA"
is graduate student J. Alex Halderman's account of how he waited several
weeks before going public with his discovery of the Sony rootkit
vulnerability so that he could consult with his attorneys. SunnComm
executives had threatened Halderman with a DMCA suit in 2003 after he
discovered a vulnerability in that company's copy-protection technology.
"Rather than being used to stop piracy, the DMCA has predominantly been
used to threaten and sue legitimate consumers, scientists, publishers, and
competitors," said the EFF's Fred von Lohmann. The report takes particular
issue with Section 1201 of the DMCA, which bars the circumvention of DRM
technologies, even in cases when circumvention would be logical and
legitimate, such as security research. Violators of the DMCA can face
severe civil and even criminal penalties. The EFF report calls for support
for the Digital Media Consumers' Rights Act, introduced by Rep. Rick
Boucher (D-Va.) in March 2005, which would require a CD to plainly state
on its label if its content has been copy-protected, as well as the return
policy for the CD in the event that it does not play properly because of
the copy-protection technology. The Consumer Electronics Association also
supports Boucher's bill. "We believe that the DMCA is overly broad," said
the association's Michael Petricone. "It's a major burden on legitimate
innovation and research that chills normal and customary consumer conduct."
Others argue that while the DMCA is imperfect, the stories of abuse are
vastly outnumbered by the millions of legal downloads that the DMCA has
helped protect against illegal copying.
Collaboration Spurs Progress on Networking
Technologies
IST Results (04/13/06)
The E-NEXT project has been successful in getting European companies,
universities, and research institutions to collaborate on the development
of new technologies in mobile and ambient networking, self-aware and
service-aware networking, and content distribution. The virtual research
center is behind the launch of the European Doctoral School on Advanced
Topics in Networking (SATIN) to produce more computer networking
researchers, and the CoNEXT conference to engage researchers from the
United States and Asia on future networking technologies. E-NEXT technical
coordinator Arturo Azcorra views the project as a spark for
collaboration that makes self-sustainability and investment possible. "It
has long been evident that collaboration is profitable in the sense that
groups of researchers working together produce better results than a single
group of researchers working alone," says Azcorra. E-NEXT is set to end in
July, but will be followed up by CONTENT, a three-year initiative that
focuses on facilitating collaboration on content distribution networks,
peer-to-peer, and interactive multimedia. Considerable technological
improvements can be made as the demand for personalized media grows, says
Azcorra.
Computing Project Targets Bird Flu
IDG News Service (04/13/06) Kirk, Jeremy
Approximately 80,000 people from around the world are contributing their
computer power via the Internet to assist the Rothberg Institute for
Childhood Diseases in the study of potential drug treatments for avian
influenza. Volunteers from 93 countries have downloaded a screen saver
application that simulates the binding of drug molecules with proteins, or
targets, in avian flu. The screen saver appears in the computer's system
tray, and the program runs when the computer is idle. The research
institute compares what the program does to searching through a batch of
keys, or drugs, for the one that fits a protein in the virus. The program
then sends the results back to the research institute. The researchers are
using the distributed computing approach to send new targets in minutes to
volunteers' computers when they are running the program, called the Drug
Design and Optimization Lab (D2OL). The institute's first avian flu
target is the neuraminidase protein of H5N1, the strain officials fear is
most likely to spread from birds to humans.
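The institute's "batch of keys" analogy can be sketched as a toy search. The feature-set representation, scoring function, and molecule names below are invented for illustration; the real D2OL client runs physics-based docking simulations rather than this set comparison:

```python
# Toy illustration of the key-and-lock screening described above.
# All names and the scoring rule are invented stand-ins.

def binding_score(drug, target):
    """Toy docking score: how many target 'pockets' the candidate fits."""
    return len(drug & target)

def screen_batch(drugs, target):
    """Work done on one volunteer machine: score a batch of candidates
    and report the best fit back to the institute."""
    return max(((binding_score(d, target), d) for d in drugs),
               key=lambda pair: pair[0])

target = {"pocket_a", "pocket_b", "pocket_c"}  # stand-in for a viral protein site
batch = [
    {"pocket_a"},              # weak fit
    {"pocket_a", "pocket_b"},  # better fit
    {"pocket_x"},              # no fit
]
best_score, best_drug = screen_batch(batch, target)
print(best_score)  # → 2
```

Distributing many such batches across idle volunteer machines and collecting only the best-scoring candidates is the essence of the project's approach.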
Does Every Vote Count?
San Antonio Express-News (TX) (04/09/06) Chapa, Rebeca
In the wake of recent contentious elections that ended in recounts of
paper ballots, computer experts have been calling for a nationwide mandate
that would require all e-voting machines to produce a paper trail. "You
can't trust an election that's run with paperless machines," said Avi
Rubin, computer science professor at Johns Hopkins University. "There
isn't any way to recover the results." Currently, 25 states require their
voting machines to contain a voter-verified paper trail, though more are
having to wrestle with the issue as they race to purchase new equipment
under the 2002 Help America Vote Act. Rep. Rush Holt (D-N.J.) has
introduced legislation that would require every precinct to use machines
that produce a paper trail and each state to conduct unannounced audits of
2 percent of its jurisdictions. The U.S. Government Accountability Office
released a report in September touting the potential of e-voting machines
to improve the election process, though it mentioned the numerous warnings
that have raised "concerns about their security and reliability." If
election results are contested, Rubin and Stanford computer scientist
David Dill argue that without a voter-verified paper trail, auditors will
only be able to reprint the ballots, which would simply reproduce the same
errors that the machines made on election day. Nevada has implemented
machines with voter-verifiable paper trails in each of its 17 counties, and
has met with positive feedback from voters. In Leon County, Fla.,
elections administrator Ion Sancho sparked controversy last year when he
invited security researchers to attempt to hack into the county's Diebold
machines. While the security experts succeeded in penetrating the system,
Diebold lashed out at Sancho, calling his tests "foolish and
irresponsible." With counties throughout the country scrambling to
implement new systems, vendors are also having difficulty keeping up with
demand.
To read USACM's recent report, "Statewide Databases of Registered Voters,"
visit
http://www.acm.org/usacm/VRD
An Ever-Widening Web Is Reaching Out to Pull Us In
Baltimore Sun (04/16/06) P. 1F; Williams, Larry
Real-time and archived television streams and downloads, instant
messaging, blogging, and mobility are driving the Internet's expansion.
About 18,000 computer servers were used to stream games to viewers of CBS
SportsLine's free Web broadcasts of the NCAA Basketball Tournament.
Upwards of 102 billion Internet requests for Web content were made on the
first day of the tournament, peaking at 2.3 million requests per second.
Web users are already downloading shows from network archives and will soon
be able to watch free ad-driven programming. Instant messaging is quickly
evolving from the playland of the young to a useful corporate tool that
facilitates communications between workers and customers. Standard IM
systems now incorporate voice and video options. Content providers such as
Yahoo! and Google are using blog capabilities to attract members and are
adding other functions, such as dating services, online data storage, and
financial help. Meanwhile, improved mobile online access is putting all
these tools in the hands of cell phone users.
Lack of Communication From ICANN Could Prove Fatal
Computerworld New Zealand (04/12/06) Bell, Stephen
Longtime ICANN participant Vittorio Bertola of Italy claims that ICANN's
public board meetings are no longer serving their purpose as a forum
for dialogue, and that ICANN may push people elsewhere if the trend continues.
Bertola says that, these days, the public portions of ICANN board meetings
consist mostly of reports, with only 5 percent of the time devoted to brief
comments by attendees. ICANN's proposed renewal of VeriSign's .com
contract also has rankled some Internet governance participants. "This
room [was] practically empty," says Bertola of the last ICANN public board
meeting in New Zealand. Bertola sees this empty room as a sign that people
are going elsewhere. ICANN business constituency participant Grant Forsyth
from New Zealand says ICANN has ignored public input on the VeriSign
contract. Forsyth says the business community has written to the U.S.
Department of Commerce about the contract. ISP advocate Tony Holmes agrees
that ICANN's board has not digested public input on this issue.
MANIAC Challenge to Stimulate Student Experimentation in
Wireless Networking
Virginia Tech News (04/04/06) Nystrom, Lynn
NSF has awarded a three-year, $450,000 grant to two Virginia Tech
researchers to develop the Mobile Ad Hoc Networking Interoperability And
Cooperation Challenge (MANIAC) to encourage student interest in wireless
networking. The researchers looked at open competitions in software
engineering, robotics, and automotive design, and noted that no similar
contest exists in wireless networking. "These competitions are very
motivating, not to mention fun. Also, failure often teaches us more than
success, and implementation is always more convincing than simulation,"
said Luiz DaSilva, an associate professor of electrical and computer
engineering. "The kind of informal exchange of ideas that occurs naturally
in a competition like this tends to move research forward in unexpected
ways." The contest will present students with the central challenges of the
industry, including the extent to which bandwidth, signal strength, or
speed should be compromised to ensure that the system is reliable and
effective. DaSilva and Allen MacKenzie, an assistant professor of
electrical and computer engineering, expect the contest to produce new
networking techniques and algorithms as students send data over ad hoc
networks. Entries will be evaluated on speed and efficiency. The 2007
contest will consist of a video and data relay race on a mobile ad hoc
network (MANET), where contestants will only be able to deliver traffic with
the help of others. The software will also contribute to researchers' body
of knowledge about ad hoc networks by monitoring the behavior of nodes and
system effectiveness, providing a real testbed in a field long dominated by
simulation. "There has recently been some soul searching by the networking
community regarding the prevailing use of simulation as the main research
methodology," DaSilva said. "This competition will provide researchers
with a unique opportunity to study real-life network behavior in the
wild."
More Cash for the Labs?
Electronic Business (04/06) Vol. 32, No. 4, P. 14; Crotty, Cameron
With the announcement of the American Competitiveness Initiative (ACI) at
his State of the Union address in January, President Bush vindicated the
efforts of a long-frustrated technology lobby that had been trying for
years to elevate basic research to a top funding priority. Though the
announcement of a sweeping program to double government investment in basic
scientific research over 10 years was roundly welcomed, research advocates
will now turn their attention to the details as the spending bills appear
in Congress. Increased research funding enjoys bipartisan support, but
Daryl Hatano of the Semiconductor Industry Association noted that funding
this year is very limited. "The president talked about the need to hold in
discretionary spending, and Congress will have a lot of priorities as
well," he said. To support the basic, high-risk research that aims to
benefit the entire industry, rather than leading to a specific product, the
ACI calls for increases in funding for the NSF, the Department of Energy,
and the National Institute of Standards and Technology over 10 years, and
an increase of about 8 percent in the Department of Defense's
basic-research budget in 2007. The presidential program joins the National Innovation
Act and the Protecting America's Competitive Edge (PACE) package of three
bills, two pending legislative initiatives in the Senate also aiming to
increase funding for research. Despite the renewed focus on innovation,
the American Association for the Advancement of Science reports that the
Bush administration's proposed 2007 budget only increases research funding
by 1.9 percent from 2006, not even enough to keep up with expected
inflation of 2.2 percent. Even if the numbers fall short of
expectations, the industry can take heart that most proposals have
addressed issues such as education and tax credits for research, in
addition to increasing government funding.
Can Wireless Standards Work Together?
Sensors (04/06) Vol. 23, No. 4, P. 20; Fuhr, Peter; Kagan, Hesh
Current and future users of wireless technology must pick their way
through a quagmire of technologies, standards, and operating principles.
This scenario leads to speculation as to whether the wireless industrial
sector will soon experience a frenzy of activity similar to the industrial
bus wars at the close of the 20th century, or whether the switch to
wireless networking will be smooth through a deeper comprehension of
current technology and past experience. There are so many varieties of
wireless, each supported by corporate/marketing partnerships and vast
numbers of products, that their coexistence in the same frequency and
physical footprint is debatable. Two IEEE standards groups--the
Recommended Practice for Coexistence in Unlicensed Bands group and the
Coexistence of Wireless Data Transport group--are trying to tackle the
coexistence challenge. Meanwhile, the Instrumentation, Systems, and
Automation (ISA) Society is working on a reconciliation between wireless
standards and industrial deployment activities. The ISA's SP100 Committee,
whose responsibility is the delivery of functional wireless technology to
the industrial sector, has defined the key terms of coexistence,
interoperability, and interworking. Coexistence is defined as a system's
ability to execute a task in an environment where other systems that may or
may not be employing a similar set of rules are present; interoperability
is two systems' ability to perform a single task using one set of rules;
and interworking is the ability of two systems to carry out a task where
different rules apply to each system.
Supercomputing Is Here!
Campus Technology (04/06) Vol. 19, No. 8, P. 44; Villano, Matt
Indiana University, the University of Utah, Embry-Riddle Aeronautical
University, and the University of Florida show how the latest academic
supercomputing deployments are surpassing most people's expectations in
terms of computing power. Indiana University aims to revolutionize
dangerous weather forecasting so that governments can better prepare for
natural catastrophes and reduce casualties through the Linked Environments
for Atmospheric Discovery (LEAD) initiative, a National Science
Foundation-funded project that utilizes a grid computing methodology for
"building an adaptive, on-demand computer and network infrastructure that
responds to complex weather-driven events," says co-principal investigator
Dennis Gannon. Incoming weather data is interpreted by software agents
that study the data for specific hazardous patterns; once such patterns are
identified, the information is sent to numerous high-performance computers
across private networks for real-time processing and assessment.
Meanwhile, Embry-Riddle is using a new Beowulf cluster to speed up
various research projects, including those focusing on the study of
upper-atmospheric acoustic-gravity waves. Among the cluster's challenges
was the need for heavy code modification in order to exploit the cluster's
parallel processing capabilities. University of Florida researchers will
undertake advanced "multi-scale" climate modeling, molecular dynamics, and
aerodynamic engineering projects with a 200-node supercomputer at the High
Performance Center (HPC), which enlisted Cisco Systems to supply all the
internodal networking connections, as well as to help the university link
all of its on-campus clusters so HPC can carry out more grid-based
computations. A loose-coupling node architecture is used for grid
applications, while a tight-coupling architecture is employed in the UF
cluster. Finally, the University of Utah has acquired a metacluster to
support advanced bioinformatics applications, and it comes with a
"condominium"-style sub-cluster where extra capacity can be added.
Extreme Computing
Redmond (04/06) Vol. 12, No. 4, P. 28; Desmond, Michael
General-purpose computers are being modified and redesigned to function
reliably in extremely inhospitable environments that range from deserts to
polar regions to outer space. Few observers consider ruggedized equipment
alone sufficient, and most recommend including redundant systems. For
example, a Special Forces team led by Master Sergeant Ben Thomas used
Panasonic Toughbooks during a nine-month stint in Afghanistan to
communicate, map out, and plan missions in dusty and dirty conditions, but
the team also carried stock radio equipment for communications in the most
volatile field operations. A key requirement for the use of
Hewlett-Packard's iPAQ 1510 personal digital assistants (PDAs) aboard the
International Space Station was thorough testing and, when necessary,
alteration of the handheld to ensure safety. HP electrical engineer Scott
Briggs notes that the iPAQ's dry capacitors--a necessity because of its
small size and low power consumption--eliminated concerns of liquid-filled
capacitors leaking in zero gravity, while the PDA's small electrical
circuits restricted the effects of gamma rays. Worries that a blow to the
iPAQ would dislodge shards of glass so small they could be inhaled by the
astronauts were allayed by the addition of a commercially available
laminate screen cover. The threat of malware is just as real in hostile
environments as it is in less volatile ones, and measures people are taking
to reduce or eliminate this danger include preventing key systems from
accessing the Internet, and the deployment of remediation tools, layered
defenses, and knowledgeable personnel.
Big Brother Is Listening
Atlantic Monthly (04/06) Vol. 297, No. 3, P. 65; Bamford, James
Technological advancements have widened the scope of National Security
Agency (NSA) surveillance, while the legal barriers to such eavesdropping
have been lowered with a White House mandate that permits the NSA to place
Americans on watch lists and monitor their communications without first
obtaining permission from the Foreign Intelligence Surveillance Act (FISA)
court. Previously a court order was required, and could only be secured if
the NSA showed that it had probable cause to eavesdrop on people suspected
of involvement with terrorist organizations. Now people can be placed on
watch lists by NSA shift supervisors who have a "reasonable belief" of
involvement, and the number of Americans targeted by the NSA has
consequently ballooned from perhaps 12 annually to 5,000 over the last four
years, according to sources. If innocent people are marked because they
fulfill these highly subjective criteria, they may be denied visas, federal
jobs, or other services and privileges without ever knowing why. The NSA's
surveillance methodology is signals intelligence, in which electronic
communications containing vast quantities of emails and phone calls are
intercepted and run through computers that flag specific words, phrases,
names, phone numbers, and Internet addresses, and forward these
communications to analysts. Also clearing the way for greater NSA
surveillance is the FCC's extension of the 1994 Communications Assistance
for Law Enforcement Act (CALEA) to cover "any type of broadband Internet
access service" and new Internet phone services, while the two
congressional intelligence committees tasked with protecting the public
from privacy abuses have abnegated their responsibilities. The NSA likes
to hire people away from providers of critical telecommunications system
components, offering them the opportunity to work with state-of-the-art
equipment and contribute to national security. Furthermore, a great deal
of the telecommunications industry secretly cooperates with the NSA in its
eavesdropping efforts.
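The flagging process the article describes, in which intercepted traffic is scanned against a watch list of words, names, numbers, and addresses and matches are forwarded to analysts, can be sketched as a simple selector filter. The watch-list entries and messages below are invented placeholders, not anything drawn from an actual system:

```python
# Simplified sketch of selector-based flagging; all data is illustrative.

WATCH_LIST = {"example-name", "555-0100", "host.example.net"}

def flag(messages, selectors=WATCH_LIST):
    """Return only the messages containing at least one watched selector;
    in the article's terms, these are the ones forwarded to analysts."""
    return [m for m in messages
            if any(s in m.lower() for s in selectors)]

traffic = [
    "routine note about lunch plans",
    "call 555-0100 before the meeting",
    "ssh into host.example.net tonight",
]
flagged = flag(traffic)
print(len(flagged))  # → 2
```

The sketch also illustrates the article's concern about subjective criteria: everything hinges on what goes into the watch list, and a message is flagged on any match, with no context.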