Twelve years after it was discovered and not addressed, AMAZING

In summary: a flaw in the way the Internet works has still not been fixed, twelve years after it was first demonstrated; one reply counters that with 50,000 people dying each year on the roads, fixing the flaw should not be the top priority.
  • #1
rhody
Gold Member
http://news.yahoo.com/s/ap/20100508/ap_on_hi_te/us_tec_fragile_internet
By Peter Svensson, AP Technology Writer – Sat May 8, 10:33 am ET

NEW YORK – In 1998, a hacker told Congress that he could bring down the Internet in 30 minutes by exploiting a certain flaw that sometimes caused online outages by misdirecting data. In 2003, the Bush administration concluded that fixing this flaw was in the nation's "vital interest."

Fast forward to 2010, and very little has happened to improve the situation. The flaw still causes outages every year. Although most of the outages are innocent and fixed quickly, the problem still could be exploited by a hacker to spy on data traffic or take down websites. Meanwhile, our reliance on the Internet has only increased. The next outage, accidental or malicious, could disrupt businesses, the government or anyone who needs the Internet to run normally.
and
Pieter Poll, the chief technology officer at Qwest Communications International Inc., says that he would support some simple mechanisms to validate data routes, but he argues that fundamental reform isn't necessary. Hijackings are typically corrected quickly enough that they don't pose a major threat, he argues.

One fix being tested would stop short of making the routing system fully secure but would at least verify part of it. Yet this system also worries carriers because they would have to work through a central database.

"My fear is that innovation on the Internet would slow down if there's a need to go through a central authority," Poll says. "I see little appetite for that in the industry."

Jeffrey Hunker, a former senior director for critical infrastructure in the Clinton administration, says he's not surprised that little has happened on the issue since 2003. He doesn't expect much to happen in the next seven years, either.

"The only thing that's going to drive adoption is a major incident, which we haven't had yet," he says. "But there's plenty of evidence out there that a major incident would be possible."

In the meantime, network administrators deal with hijackings the old-fashioned way: calling their counterparts close to where the hijacking is happening to get them to manually change data routes. Because e-mails may not arrive if a route has been hijacked, the phone is a more reliable option, says Tom Daly, chief technical officer of Dynamic Network Services Inc., which provides Web hosting and other Internet services.

"You make some phone calls and hope and pray," Daly says. "That's about it."

Like the recent huge oil spill with its terrible consequences, not until someone, by accident or intentionally through sabotage, brings the Internet to its knees will anyone pay attention to it. We have seen the enemy and it is us... We never seem to learn...

Rhody...
 
  • #2
mgb_phys
One implementation of DNS is more susceptible to cache poisoning: you have a one in 65,535 chance of guessing the secret key needed to send a false update request, instead of the one in 4 billion chance with all the other implementations.
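
To put those odds in perspective, here is a minimal sketch of the brute-force math (my own illustration, not part of the post, assuming an attacker who simply enumerates the keyspace):

```python
# Expected number of guesses to hit a fixed secret key when the
# attacker enumerates a keyspace of 2**key_bits values in order.

def expected_guesses(key_bits: int) -> float:
    keyspace = 2 ** key_bits
    return keyspace / 2  # on average the key is found halfway through

for bits in (16, 32):
    print(f"{bits}-bit key: ~{expected_guesses(bits):,.0f} guesses on average")

# 16-bit key: ~32,768 guesses        -> trivial at network packet rates
# 32-bit key: ~2,147,483,648 guesses -> impractical for a quick attack
```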

50,000 people die each year on the roads - fix that first.
 
  • #3
mgb_phys said:
One implementation of DNS is more susceptible to cache poisoning: you have a one in 65,535 chance of guessing the secret key needed to send a false update request, instead of the one in 4 billion chance with all the other implementations.

50,000 people die each year on the roads - fix that first.

mgb_phys,

Fair enough, but how common is that one implementation of DNS that can be cracked with one-in-65,535 odds? In essence, what you are saying is that the key space is only 2^16 - 1 values (a 16-bit key).

What you say is true, but with about 137 people killed each day (50,000 / 365), how does that compare to the effect of the entire planet losing the Internet? I can imagine the disruption to communication alone could cause accidents killing more than that number during the outage. The unintended consequences of losing all Internet traffic for a substantial period would drastically raise the odds of accidents that normally would not happen, with the potential to kill more than 137 people in a single day. :wink:

Rhody... :devil:
 
  • #4
mgb_phys
DNS, like most of the Internet, doesn't have a lot of security.
When the Internet was invented, the presence of so many bad actors wasn't really considered.
There are good aspects of this: physicsforums can set up a website without a government license, and you can post without having to register an official account with some ministry. But there are problems, like spam, because anyone can send an email anonymously.

One consequence of having no central authority is DNS. Twenty years ago we had huge lists mapping names to IP addresses, and we regularly had to mail updates to each other; this worked when only a few hundred universities were on the net.
Today there is a distributed tree of nameservers, each informing the layer above of the name-to-address mapping. This way you don't need a central global mega-server, but the protocol for changing a mapping isn't terribly secure. There is no hard authentication, just a simple PIN code (16 bits in one implementation) showing you have permission to change the address; see http://www.schneier.com/blog/archives/2008/07/the_dns_vulnera.html for more details.

This doesn't affect https, which solves the problem in a different way; the flaw has been fixed in all DNS implementations, and it isn't the terrorism Armageddon the news article claimed.
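
For concreteness, here is a rough sketch (my own illustration, not from the Schneier post) of where that 16-bit "PIN" lives: it is the transaction ID in the DNS message header, and a resolver accepts a reply only if the ID matches its outstanding query:

```python
import struct

# Sketch: a minimal DNS header, showing the 16-bit transaction ID that
# acts as the "PIN code" described above. An off-path attacker forging
# a response has to guess this ID to have the answer accepted.

def dns_header(txid: int) -> bytes:
    flags = 0x0100          # standard query, recursion desired
    qdcount, ancount, nscount, arcount = 1, 0, 0, 0
    return struct.pack("!HHHHHH", txid, flags,
                       qdcount, ancount, nscount, arcount)

header = dns_header(txid=0xBEEF)
print(header.hex())  # 'beef' appears as the first two bytes: that ID is
                     # all the secret protecting the update from forgery
```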
 
  • #5
mgb_phys said:
DNS, like most of the Internet, doesn't have a lot of security.
When the Internet was invented, the presence of so many bad actors wasn't really considered.
There are good aspects of this: physicsforums can set up a website without a government license, and you can post without having to register an official account with some ministry. But there are problems, like spam, because anyone can send an email anonymously.

One consequence of having no central authority is DNS. Twenty years ago we had huge lists mapping names to IP addresses, and we regularly had to mail updates to each other; this worked when only a few hundred universities were on the net.
Today there is a distributed tree of nameservers, each informing the layer above of the name-to-address mapping. This way you don't need a central global mega-server, but the protocol for changing a mapping isn't terribly secure. There is no hard authentication, just a simple PIN code (16 bits in one implementation) showing you have permission to change the address; see http://www.schneier.com/blog/archives/2008/07/the_dns_vulnera.html for more details.

This doesn't affect https, which solves the problem in a different way; the flaw has been fixed in all DNS implementations, and it isn't the terrorism Armageddon the news article claimed.

mgb_phys,

Thanks for the link; I am familiar with Bruce Schneier. I used to get a forwarded copy of his security newsletter from a friend, and it was always interesting security reading.

Like the stock market scare last week, which started with a large Procter & Gamble trade late in the day, I am more concerned that the entire web infrastructure will be taken down by accident.

I have been in the software industry over 30 years, and in my own area of work I have seen showstoppers pop up in software, or in combined software and hardware, that had been tested and fielded after seven revisions of rigorous testing, internal peer review, and independent peer review.

That being said, these are large distributed systems with tens of millions of lines of code in them, as well as hardware modernization changes. Such complexity by its very nature leads to potential problems when everything comes together as a "system". Testing everything for 100% code coverage (as in Space Shuttle software) is impossible. Problems are to be expected the longer the "system" is in continuous use. Hopefully not life-threatening ones...

rhody...

P.S. Some of the safety-related systems we work with have four backups, three of which cross-check one another, and two degraded modes of operation, and we still have problems with each new release of software and hardware.
 
  • #6
The Internet is composed of thousands of unrelated, independent networks that just "hand off" traffic at peering points. The Internet is not one "network"; it is many networks, and traffic can be blocked from any other network.

The very rare times one network has affected another were when router errors caused broadcast loops to spill into another network that had agreed to take the overflow traffic. This hasn't happened in ages. It could easily have been prevented if the routing tables had been verified before the new edge routers were brought online, as in the sketch below.
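
As a loose illustration of such a pre-deployment check (my own sketch; the prefixes and table are hypothetical, and real networks validate far more than this):

```python
import ipaddress

# Sketch: validate a new router's announcements against the prefixes
# the network is actually authorized to originate, before bringing it
# online. The allowlist and candidate routes are made-up examples.

AUTHORIZED = [ipaddress.ip_network(p) for p in ("198.51.100.0/24",
                                                "203.0.113.0/24")]

def unauthorized_routes(candidates):
    """Return announcements not covered by any authorized prefix."""
    bad = []
    for route in candidates:
        net = ipaddress.ip_network(route)
        if not any(net.subnet_of(auth) for auth in AUTHORIZED):
            bad.append(route)
    return bad

new_router_table = ["198.51.100.0/25", "192.0.2.0/24"]  # second is a leak
print(unauthorized_routes(new_router_table))  # ['192.0.2.0/24']
```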
 

