Twelve years after it was discovered and not addressed, AMAZING

  1. May 12, 2010 #1

    rhody

    Gold Member

    http://news.yahoo.com/s/ap/20100508/ap_on_hi_te/us_tec_fragile_internet [Broken]
    By Peter Svensson, AP Technology Writer – Sat May 8, 10:33 am ET

    Like the huge oil spill recently, with its terrible consequences, not until someone brings the internet to its knees, by accident or intentionally through sabotage, will anyone pay attention to it. We have seen the enemy and it is us... We never seem to learn...

    Rhody...
     
    Last edited by a moderator: May 4, 2017
  2. May 12, 2010 #2

    mgb_phys

    Science Advisor
    Homework Helper

    One implementation of DNS is more susceptible to cache poisoning: you have a one in 65,535 chance of guessing the secret transaction ID needed to slip in a false update, instead of the one-in-4-billion chance with all the other implementations.
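
    To put rough numbers on that (my own back-of-the-envelope sketch, not from the article): an attacker who never repeats a guess needs about half the key space on average.

    Code:
    # Sketch: average number of distinct guesses needed to hit a uniformly
    # random secret of n bits, guessing without repetition.
    def avg_guesses(bits):
        n = 2 ** bits
        return (n + 1) / 2  # expected position of the secret in a random search order

    print(f"16-bit ID alone: ~{avg_guesses(16):,.0f} guesses")  # ~32,768
    print(f"ID + 16-bit random port: ~{avg_guesses(32):,.0f} guesses")  # ~2.1 billion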

    50,000 people die each year on the roads - fix that first.
     
  3. May 12, 2010 #3

    rhody

    Gold Member

    mgb_phys,

    Fair enough, but how common is that one implementation of DNS that could be cracked with a one in 65,535 chance of succeeding? In essence, what you are saying is that the key is only 16 bits long, so there are just 2^16 - 1 = 65,535 possible values.

    What you say is true, but with 136.99 people being killed each day on average (50,000 ÷ 365 in a non-leap year), how does that compare to the effect of the entire planet losing the internet? I can imagine the disruption in communication alone causing accidents that kill more than that number during the outage period. The unintended consequences of losing all internet traffic for a substantial period drastically raise the odds of accidents that normally would not happen, with the potential to kill more than 136.99 people in a single day. :wink:

    Rhody... :devil:
     
    Last edited: May 12, 2010
  4. May 12, 2010 #4

    mgb_phys

    Science Advisor
    Homework Helper

    DNS, like most of the internet, doesn't have a lot of security.
    When the internet was invented, the presence of so many bad actors wasn't really considered.
    There are good aspects to this: physicsforums can set up a web site without a government license, and you can post without having to register an official account with some ministry. But there are problems - like spam - because anyone can send an email anonymously.

    DNS is one consequence of having no central authority. Twenty years ago we kept huge lists mapping names to IP addresses and regularly had to mail updates to each other; that worked when only a few hundred universities were on the net.
    Today there is a distributed tree of nameservers, each informing the next layer up of the name-to-address mapping. This way you don't need a central global mega-server, but the protocol for changing a mapping isn't terribly secure. There is no hard authentication - just a simple PIN code (16 bits in one implementation) proving you have permission to change this address - see http://www.schneier.com/blog/archives/2008/07/the_dns_vulnera.html for more details.
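
    As a rough illustration of where that 16-bit ID lives (my own sketch, not from Schneier's post): the query below is built by hand and the reply is matched only by echoing the ID back. 8.8.8.8 is just an example public resolver.

    Code:
    import random, socket, struct

    # DNS header: 16-bit transaction ID, flags (recursion desired), QDCOUNT=1, rest zero.
    txid = random.getrandbits(16)  # the only "secret" in a classic query
    header = struct.pack(">HHHHHH", txid, 0x0100, 1, 0, 0, 0)

    # Question section: example.com, type A (1), class IN (1).
    qname = b"".join(bytes([len(p)]) + p.encode() for p in "example.com".split(".")) + b"\x00"
    query = header + qname + struct.pack(">HH", 1, 1)

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(2)
    sock.sendto(query, ("8.8.8.8", 53))
    reply, _ = sock.recvfrom(512)
    # A resolver accepts whichever reply echoes this ID first - which is
    # exactly the race a cache-poisoning attacker tries to win.
    assert struct.unpack(">H", reply[:2])[0] == txid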

    This doesn't affect HTTPS, which solves the problem in a different way (the server proves its identity with a certificate). The hole has since been fixed in all DNS implementations, and it isn't the terrorism Armageddon that the news article claimed.
     
  5. May 12, 2010 #5

    rhody

    Gold Member

    mgb_phys,

    Thanks for the link; I am familiar with Bruce Schneier. I used to get a forwarded copy of his security newsletter from a friend - always interesting security material to read.

    Like the stock market scare last week that started with a large Procter & Gamble trade late in the day, I am more concerned that the entire web infrastructure will be taken down by accident.

    I have been in the software industry for over 30 years, and in my own area of work I have seen show-stoppers pop up in software - or in combined software and hardware - that had been tested and fielded after seven revisions of rigorous testing, internal peer review, and independent peer review.

    That being said, these are large distributed systems with tens of millions of lines of code, plus hardware modernization changes. Such complexity by its very nature leads to potential problems when the pieces come together as a "system". Testing for 100% code coverage (as in Space Shuttle software) is impossible at this scale, so problems are expected the longer the "system" is in continuous use. Hopefully not life-threatening ones...

    rhody...

    P.S. Some of the safety-related systems we work with have four backups, three channels that cross-check one another, and two degraded modes of operation, and we still have problems with each new release of software and hardware.
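
    For what it's worth, here is a toy sketch of the cross-check idea (entirely hypothetical - real voting logic is far more involved than this):

    Code:
    from collections import Counter

    # Toy 2-out-of-3 cross-check: if a majority of channels agree, use that
    # value; otherwise fall back to a degraded mode rather than trust one channel.
    def vote(readings):
        value, count = Counter(readings).most_common(1)[0]
        return (value, "normal") if count >= 2 else (None, "degraded")

    print(vote([101, 101, 250]))  # (101, 'normal') - one faulty channel is outvoted
    print(vote([99, 101, 250]))   # (None, 'degraded') - no majority, degrade safely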
     
    Last edited: May 12, 2010
  6. May 12, 2010 #6

    Evo


    Staff: Mentor

    The internet is composed of thousands of unrelated, independent networks that just "hand off" traffic at peering points. The internet is not one "network"; it is many networks, and any of them can block traffic from any other.

    The very rare times that one network has affected another were when router errors caused broadcast loops to spill into another network that had agreed to take the first network's overflow traffic. This hasn't happened in ages, and it could easily have been prevented if the routing tables had been verified before the new edge routers were brought online.
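
    A toy version of that verification step (my own illustration, with made-up router names): walk each prefix's chain of next hops and flag any loop before the new router goes live.

    Code:
    # Follow next-hop chains per prefix; revisiting a router means a forwarding loop.
    tables = {
        "edge1": {"10.0.0.0/8": "core"},
        "core":  {"10.0.0.0/8": "edge2"},
        "edge2": {"10.0.0.0/8": "edge1"},  # misconfigured: points back at edge1
    }

    def find_loops(tables):
        loops = []
        for start, routes in tables.items():
            for prefix in routes:
                seen, hop = {start}, routes[prefix]
                while hop in tables:  # stop once traffic leaves our networks
                    if hop in seen:
                        loops.append((prefix, start))
                        break
                    seen.add(hop)
                    hop = tables[hop].get(prefix)
        return loops

    print(find_loops(tables))  # flags the 10.0.0.0/8 loop from every starting router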
     