Twelve years after it was discovered and not addressed, AMAZING

AI Thread Summary
The discussion highlights ongoing concerns about the fragility of the Internet, particularly regarding vulnerabilities in data routing that could lead to outages or security breaches. Despite warnings dating back to 1998 about a flaw that could be exploited by hackers, little progress has been made to secure the Internet's infrastructure, with experts suggesting that significant changes are unlikely without a major incident. Current methods for addressing routing hijacks are largely manual and reactive, relying on direct communication between network administrators. Some proposed solutions, like validating data routes, face resistance from industry leaders who fear that centralizing control could stifle innovation. The conversation also touches on the inherent security weaknesses of the Domain Name System (DNS), which lacks robust authentication measures, raising concerns about potential exploitation. Overall, the discussion underscores a critical need for improved security measures in an increasingly interconnected world, while acknowledging the complexities and challenges of implementing such changes.
rhody
http://news.yahoo.com/s/ap/20100508/ap_on_hi_te/us_tec_fragile_internet
By PETER SVENSSON, AP Technology Writer – Sat May 8, 10:33 am ET

NEW YORK – In 1998, a hacker told Congress that he could bring down the Internet in 30 minutes by exploiting a certain flaw that sometimes caused online outages by misdirecting data. In 2003, the Bush administration concluded that fixing this flaw was in the nation's "vital interest."

Fast forward to 2010, and very little has happened to improve the situation. The flaw still causes outages every year. Although most of the outages are innocent and fixed quickly, the problem still could be exploited by a hacker to spy on data traffic or take down websites. Meanwhile, our reliance on the Internet has only increased. The next outage, accidental or malicious, could disrupt businesses, the government or anyone who needs the Internet to run normally.
and
Pieter Poll, the chief technology officer at Qwest Communications International Inc., says that he would support some simple mechanisms to validate data routes, but he argues that fundamental reform isn't necessary. Hijackings are typically corrected quickly enough that they don't pose a major threat, he argues.

One fix being tested would stop short of making the routing system fully secure but would at least verify part of it. Yet this system also worries carriers because they would have to work through a central database.

"My fear is that innovation on the Internet would slow down if there's a need to go through a central authority," Poll says. "I see little appetite for that in the industry."

Jeffrey Hunker, a former senior director for critical infrastructure in the Clinton administration, says he's not surprised that little has happened on the issue since 2003. He doesn't expect much to happen in the next seven years, either.

"The only thing that's going to drive adoption is a major incident, which we haven't had yet," he says. "But there's plenty of evidence out there that a major incident would be possible."

In the meantime, network administrators deal with hijacking the old-fashioned way: calling their counterparts close to where the hijacking is happening to get them to manually change data routes. Because e-mails may not arrive if a route has been hijacked, the phone is a more reliable option, says Tom Daly, chief technical officer of Dynamic Network Services Inc., which provides Web hosting and other Internet services.

"You make some phone calls and hope and pray," Daly says. "That's about it."

Like the recent huge oil spill with its terrible consequences, not until someone, by accident or intentionally through sabotage, brings the internet to its knees will anyone pay attention to it. We have seen the enemy and it is us... We never seem to learn...

Rhody...
 
One implementation of DNS is more susceptible to cache poisoning: you have a one in 65,535 chance of guessing the secret key needed to send a false update request, instead of the one in 4 billion chance with all the other implementations.

50,000 people die each year on the roads - fix that first.
 
mgb_phys said:
One implementation of DNS is more susceptible to cache poisoning: you have a one in 65,535 chance of guessing the secret key needed to send a false update request, instead of the one in 4 billion chance with all the other implementations.

50,000 people die each year on the roads - fix that first.

mgb_phys,

Fair enough, but how common is that one implementation of DNS that could be cracked with a one in 65,535 chance? In essence, what you are saying is that the key is only 16 bits long, so there are just 2^16 = 65,536 possible values to guess from.
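To put rough numbers on that, here is a quick back-of-the-envelope sketch; the function name and packet counts are mine, purely for illustration, and not taken from any actual DNS implementation:

```python
# Illustrative only: odds of forging a DNS response by blind-guessing
# the secret key, for the 16-bit key discussed above vs. a ~32-bit one.

def spoof_success_probability(key_bits: int, attempts: int) -> float:
    """P(at least one of `attempts` uniform guesses hits a 2**key_bits key)."""
    keyspace = 2 ** key_bits
    return 1.0 - (1.0 - 1.0 / keyspace) ** attempts

for bits in (16, 32):
    for attempts in (1, 1000, 65536):
        p = spoof_success_probability(bits, attempts)
        print(f"{bits}-bit key, {attempts:>5} forged packets: p = {p:.6f}")
```

With a 16-bit key, roughly 65,000 forged packets already give better-than-even odds (about 0.63); against a 32-bit key the same flood succeeds with probability on the order of 10^-5.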

What you say is true, but with about 137 people killed each day (50,000/365), how does that compare to the effect of not having an internet for the entire planet? I can imagine that disruption in communication alone could result in accidents killing more than that number during the outage period. The unintended consequences of losing all internet traffic for a substantial period raise the odds drastically for accidents that normally would not happen, with the potential to kill more than 137 people in a single day. :wink:

Rhody... :devil:
 
DNS, like most of the internet, doesn't have a lot of security.
When the internet was invented, the presence of so many bad actors wasn't really considered.
There are good aspects of this: physicsforums can set up a website without a government license, and you can post without having to register an official account with some ministry. But there are problems - like spam - because anyone can send an email anonymously.

One consequence of having no central authority is DNS. 20 years ago we used to have huge lists mapping names to IP addresses, and we regularly had to mail updates to each other; this worked when only a few hundred universities were on the net.
Today there is a distributed tree of nameservers, each of which informs the next layer up of the name-to-address mapping. In this way you don't need a central global mega-server, but the protocol for changing a mapping isn't terribly secure. There is no hard authentication - just a simple PIN code (16 bits in one implementation) proving that you have permission to change an address - see http://www.schneier.com/blog/archives/2008/07/the_dns_vulnera.html for more details.
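As a purely illustrative toy model of that delegation tree (all zone data below is invented), resolution walks down from the root one label at a time, so no single server ever needs the whole map:

```python
# A toy model of the delegation tree described above. All zone data
# here is invented for illustration; real nameservers answer referrals
# over the network rather than from an in-memory dict.

ZONES = {
    ".":            {"com.": "ns.com-registry.example"},
    "com.":         {"example.com.": "ns.example.com"},
    "example.com.": {"www.example.com.": "93.184.216.34"},
}

def resolve(name: str) -> str:
    """Walk the delegation chain from the root zone down to `name`."""
    zone = "."
    labels = name.rstrip(".").split(".")
    # Descend one label at a time: com. -> example.com. -> www.example.com.
    for i in range(len(labels) - 1, -1, -1):
        child = ".".join(labels[i:]) + "."
        answer = ZONES[zone][child]
        if child == name:
            return answer   # reached the final name-to-address mapping
        zone = child        # otherwise follow the referral one level down
    raise KeyError(name)

print(resolve("www.example.com."))  # -> 93.184.216.34
```

The weakness described above is on the update side of this scheme: changing a mapping is gated by a short secret rather than any strong identity.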

This doesn't affect https, which solves this problem in a different way; it has since been fixed in all DNS implementations, and it isn't the terrorism Armageddon that the news article claimed.
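For what it's worth, the reason https isn't affected is visible in a few lines of standard-library code: the client verifies the server's certificate against the hostname it asked for, so a lying DNS answer alone gets an attacker nothing. A minimal sketch (needs network access; example.com is just a placeholder host):

```python
import socket
import ssl

# Even if a poisoned DNS cache pointed us at the wrong address, this
# handshake would fail: the default context checks that the server's
# certificate chain is trusted AND that it matches the hostname.
ctx = ssl.create_default_context()
with socket.create_connection(("example.com", 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname="example.com") as tls:
        print("verified certificate subject:", tls.getpeercert()["subject"])
```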
 
mgb_phys said:
DNS, like most of the internet, doesn't have a lot of security.
When the internet was invented, the presence of so many bad actors wasn't really considered.
There are good aspects of this: physicsforums can set up a website without a government license, and you can post without having to register an official account with some ministry. But there are problems - like spam - because anyone can send an email anonymously.

One consequence of having no central authority is DNS. 20 years ago we used to have huge lists mapping names to IP addresses, and we regularly had to mail updates to each other; this worked when only a few hundred universities were on the net.
Today there is a distributed tree of nameservers, each of which informs the next layer up of the name-to-address mapping. In this way you don't need a central global mega-server, but the protocol for changing a mapping isn't terribly secure. There is no hard authentication - just a simple PIN code (16 bits in one implementation) proving that you have permission to change an address - see http://www.schneier.com/blog/archives/2008/07/the_dns_vulnera.html for more details.

This doesn't affect https, which solves this problem in a different way; it has since been fixed in all DNS implementations, and it isn't the terrorism Armageddon that the news article claimed.

mgb_phys,

Thanks for the link, I am familiar with Bruce Schneier. I used to get a forwarded copy of his security newsletter from a friend; always interesting security stuff to read.

Like the stock market scare last week that started with a large Procter & Gamble trade late in the day, I am more concerned that the entire web infrastructure will be taken down by accident.

I have been in the software industry over 30 years, and in my own area of work I have seen show-stoppers pop up in software, or in combined software and hardware, that had been tested and placed in the field after seven revisions of rigorous testing, internal peer review, and independent peer review.

That being said, these are large distributed systems with tens of millions of lines of code in them, as well as hardware modernization changes. Such complexity by its very nature leads to potential problems when everything comes together as a "system". Testing everything for 100% code coverage (as in Space Shuttle software) is impossible. Problems are to be expected the longer the "system" is in continuous use. Hopefully not life-threatening ones...

rhody...

P.S. Some of the safety-related systems we work with have four backups, three that cross-check one another, and two degraded modes of operation, and we still have problems with each new release of software and hardware.
 
The internet is composed of thousands of unrelated, independent networks that just "hand off" traffic at peering points. The internet is not a "network"; it is many networks, and each network can block traffic from any other.

The very rare times that one network has affected another were when router errors caused broadcast loops to spill into another network that had agreed to take the first network's overflow traffic. This hasn't happened in ages. It could easily have been prevented if the routing tables had been verified before the new edge routers were brought online.
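That kind of verification might look something like the sketch below: before accepting a route announcement, check the (prefix, origin network) pair against a registry of who is authorized to originate what. The registry contents, AS numbers, and function names here are invented for illustration; the real-world effort in this direction is what became RPKI:

```python
# Hypothetical sketch of prefix-origin validation; everything here is
# invented for illustration and greatly simplified.
import ipaddress

# Maps a registered prefix to the set of AS numbers allowed to originate it.
REGISTRY = {
    ipaddress.ip_network("203.0.113.0/24"): {64500},
    ipaddress.ip_network("198.51.100.0/24"): {64501, 64502},
}

def validate_announcement(prefix: str, origin_as: int) -> str:
    net = ipaddress.ip_network(prefix)
    for registered, allowed in REGISTRY.items():
        # A more-specific announcement of a registered prefix is the
        # classic hijack pattern, so it must match the registry too.
        if net.subnet_of(registered):
            return "valid" if origin_as in allowed else "invalid"
    return "unknown"  # no registry entry: cannot verify either way

print(validate_announcement("203.0.113.0/24", 64500))  # valid
print(validate_announcement("203.0.113.0/25", 64999))  # invalid (hijack)
print(validate_announcement("192.0.2.0/24", 64500))    # unknown
```

A router could drop "invalid" announcements outright and treat "unknown" ones as it does today, which is roughly the partial, incremental fix the article alludes to.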
 