How do websites block unauthorized requests?

In summary: POST passes data in the request body, and is for things which only need to happen once (generally creating content). POST has the further advantage that the body of the request can be encrypted, whereas you can't really encrypt a URL.
  • #1
Trollfaz
If I, say, run the httr library in R and send DELETE or PUT or POST requests to a website, can that alter the contents and display of the webpage, or do devs have methods to block requests from unidentified sources? E.g. if I send a DELETE request to the Physics Forums website, does that mean I can take down the logo?
 
  • #2
Websites would normally have user-based permissions defining what a user can see or do. Giving any user permissions like that would not happen unless there was a bug in the code.
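Roughly speaking, such a permission check boils down to mapping each role to the actions it may perform. A minimal sketch in Python (the role names and the `can` helper are purely illustrative, not any particular framework's API):

```python
# Hypothetical permission table: each role is mapped to the actions it may perform.
ROLE_PERMISSIONS = {
    "guest": {"read"},
    "member": {"read", "post"},
    "admin": {"read", "post", "edit", "delete"},
}

def can(role: str, action: str) -> bool:
    """Return True only if the given role is allowed to perform the action."""
    return action in ROLE_PERMISSIONS.get(role, set())

# A DELETE-style request from an ordinary member is simply refused.
print(can("member", "delete"))  # False
print(can("admin", "delete"))   # True
```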
 
  • #3
The admin person(s) have a username (NEVER "admin") and a password to get access to the site's inner workings, be it a forum, a website, or the content management system if the site uses one. WordPress is a common target for hacking, and there are security plugins that can help secure such a site. One of the very first things you do is rename the standard admin login page to something completely different. The average attack starts with trying the standard login page, so that simple change is a great start. Beginners get hacked because they don't know how to hide the admin login page.

Then editing things requires that the changes come from the admin panel and not from an external site. This is usually enforced with a session token: if by luck you find a form the site owner uses to edit things, but you don't have the session ID to prove you have logged in, the data in that form goes nowhere. This applies even to home-built sites the admin created themselves. Usually the lack of the token means any such page with a data-entry form is simply not displayed for you to try to use.
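As a rough sketch of that token check (the names below are made up for illustration, not any specific CMS's internals), the server only accepts an edit when the submitted token matches the one stored in the visitor's logged-in session:

```python
import secrets

# Hypothetical in-memory session store: session_id -> session data.
sessions = {}

def start_session() -> str:
    """Create a logged-in session and issue a random token to embed in the edit form."""
    session_id = secrets.token_hex(16)
    sessions[session_id] = {"logged_in": True, "form_token": secrets.token_hex(16)}
    return session_id

def handle_edit(session_id: str, submitted_token: str, new_content: str) -> str:
    """Reject the edit unless the request carries a valid session and a matching token."""
    session = sessions.get(session_id)
    if session is None or not session["logged_in"]:
        return "403: not logged in"
    if submitted_token != session["form_token"]:
        return "403: bad or missing form token"
    return f"200: content updated to {new_content!r}"

sid = start_session()
print(handle_edit(sid, "guess", "pwned"))                      # rejected
print(handle_edit(sid, sessions[sid]["form_token"], "fixed"))  # accepted
```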

As there are multiple ways to attack a site, multiple blocks exist within a WP security package to disable the ones most likely to take advantage of a weak feature of WP. The same is true for lots of other content management systems. Shopping carts can also have a security package installed.

And as these types of sites use a database, you have to know its name to send data to it.
 
  • #4
There is also something called a ".htaccess" file which can do the job to some extent.
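For example, on an Apache server a .htaccess file can refuse every HTTP method other than the handful the site actually uses. A generic Apache 2.4 sketch, not any particular site's configuration:

```apache
# Deny any HTTP method other than GET, HEAD and POST.
<LimitExcept GET HEAD POST>
    Require all denied
</LimitExcept>
```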
 
  • #5
  • #6
Trollfaz said:
if I send a DELETE request to Physics Forums website does it mean I can take down the logo?
No, because PF's server doesn't even respond to DELETE requests at all. In fact that's how most websites "defend" against such things: they don't even respond to any HTTP methods other than GET (and HEAD, which just returns the same headers as GET but no payload) and POST. And what the server does with requests with those methods depends on whether you are logged in or not and, if you are logged in, what permissions your user account has.
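To illustrate the idea (a toy sketch using Python's standard library, not how PF's server is actually written): the server only defines handlers for the methods it supports and refuses everything else.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class MethodFilterHandler(BaseHTTPRequestHandler):
    """Toy server: answers GET and POST, refuses DELETE and PUT with 405;
    any other method falls through to the library's default 501 response."""

    def do_GET(self):
        body = b"hello"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def do_POST(self):
        # A real site would check the session and permissions before storing anything.
        self.send_response(200)
        self.end_headers()

    def do_DELETE(self):
        # DELETE is simply not allowed, regardless of what is in the request.
        self.send_error(405, "Method Not Allowed")

    do_PUT = do_DELETE  # PUT gets the same refusal

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8000), MethodFilterHandler).serve_forever()
```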
 
  • #8
PeterDonis said:
No, because PF's server doesn't even respond to DELETE requests at all. In fact that's how most websites "defend" against such things: they don't even respond to any HTTP methods other than GET (and HEAD, which just returns the same headers as GET but no payload) and POST. And what the server does with requests with those methods depends on whether you are logged in or not and, if you are logged in, what permissions your user account has.
So POST in the case of posting threads as a registered member?
 
  • #9
Trollfaz said:
So POST in the case of posting threads as a registered member?
POST would be anything that is an attempt to add content, so starting a new thread, posting in an existing thread, or posting a private message.
 
  • #10
Or filling out a registration form as a non-registered user.
 
  • #11
Borg said:
Or filling out a registration form as a non-registered user.
Yes, or signing in as a registered user.
 
  • #12
GET passes data in the URL, and is for things which could happen multiple times (generally fetching content). POST passes data in the request body, and is for things which only need to happen once (generally creating content). POST has the further advantage that the body of the request can be encrypted, whereas you can't really encrypt a URL.
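The OP asked about R's httr, but the same distinction is easy to see with Python's requests library; httpbin.org is just a public echo service used here for illustration:

```python
import requests

# GET: the parameters end up in the URL itself (?q=physics&page=2),
# so the request can be repeated or cached without side effects.
r = requests.get("https://httpbin.org/get", params={"q": "physics", "page": 2})
print(r.json()["url"])    # https://httpbin.org/get?q=physics&page=2

# POST: the data travels in the request body, not the URL,
# and with HTTPS that body is encrypted in transit.
r = requests.post("https://httpbin.org/post", data={"title": "New thread", "body": "Hello"})
print(r.json()["form"])   # {'title': 'New thread', 'body': 'Hello'}
```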
 
  • #13
pasmith said:
GET passes data in the URL
More precisely, it passes things like query parameters in the URL. A GET request cannot change any documents on the server side, so nothing in the URL is "data" the way, say, a form body in a POST request is data.

pasmith said:
is for things which could happen multiple times (generally fetching content). POST passes data in the request body, and is for things which only need to happen once
More precisely, multiple GET requests with the same URL and headers are idempotent (they have the same effect as if just one of them was made). But multiple POST requests with identical content (URL, headers, and payload) are not idempotent--they could create multiple copies of the same document, for example, instead of just one.
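A tiny sketch of that distinction, with no real HTTP involved, just a fake in-memory "server":

```python
# Toy illustration of idempotency: the "server" stores threads in a list.
threads = []

def handle_get():
    """GET: only reads state; calling it many times changes nothing."""
    return list(threads)

def handle_post(title):
    """POST: creates a new record every time, even for identical payloads."""
    threads.append({"id": len(threads) + 1, "title": title})
    return threads[-1]

handle_get(); handle_get(); handle_get()
print(len(threads))   # 0  -- repeated GETs had no effect

handle_post("Hello"); handle_post("Hello")
print(len(threads))   # 2  -- two identical POSTs made two copies
```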
 

1. How do websites block unauthorized requests?

Websites use various methods to block unauthorized requests, such as IP blocking, user authentication, and captcha verification.

2. What is IP blocking and how does it work?

IP blocking is a method where websites block access to certain IP addresses that have been flagged as suspicious or malicious. This is done by maintaining a blacklist of IP addresses and denying access to those addresses when they attempt to make a request to the website.
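A minimal sketch of such a check in Python (the blocked addresses below are documentation-only example ranges):

```python
import ipaddress

# Hypothetical blocklist: individual addresses and whole ranges that get refused.
BLOCKED_NETWORKS = [
    ipaddress.ip_network("203.0.113.7/32"),   # a single flagged address
    ipaddress.ip_network("198.51.100.0/24"),  # an entire flagged range
]

def is_blocked(client_ip: str) -> bool:
    """Return True if the client's address falls inside any blocked network."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in BLOCKED_NETWORKS)

print(is_blocked("203.0.113.7"))   # True  -> request refused before any handler runs
print(is_blocked("192.0.2.10"))    # False -> request allowed through
```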

3. How does user authentication prevent unauthorized requests?

User authentication requires users to provide login credentials, such as a username and password, to access the website. This ensures that only authorized users can make requests to the website.
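A minimal sketch of that credential check in Python, using a salted password hash (illustrative only; real sites typically rely on a framework's authentication layer):

```python
import hashlib, hmac, os

# Hypothetical credential store: username -> (salt, salted password hash).
users = {}

def register(username: str, password: str) -> None:
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    users[username] = (salt, digest)

def authenticate(username: str, password: str) -> bool:
    """Only requests that present valid credentials are treated as authorized."""
    if username not in users:
        return False
    salt, stored = users[username]
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, stored)

register("alice", "correct horse battery staple")
print(authenticate("alice", "wrong password"))                # False -> rejected
print(authenticate("alice", "correct horse battery staple"))  # True  -> allowed
```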

4. What is captcha verification and how does it prevent unauthorized requests?

Captcha verification is a security measure that requires users to complete a task, such as identifying distorted letters or images, to prove that they are human. This prevents automated bots from making unauthorized requests to the website.

5. Can websites still be vulnerable to unauthorized requests even with these methods in place?

Yes, websites can still be vulnerable to unauthorized requests if the methods used are not properly implemented or if there are other vulnerabilities in the website's code. It is important for websites to regularly update and maintain their security measures to prevent unauthorized access.
