I'm thinking a little differently here. Something that quite often causes problems for users is trying to use multiple programs (browsers, auto-whatevers, etc.) to access the same neopets account at once. So here's a new program request to help fix those problems.
What I want is a localhost proxy server that modifies the headers on all HTTP requests sent to *.neopets.com. It would have to monitor and track cookies in replies and rewrite them on the fly in subsequent requests.
So basically, it needs to do the following:
Code:
-Listen on a configurable local IP/port for HTTP requests, modify them, and forward them to the destination HTTP server.
-Be able to forward through a configurable external proxy IP:port.
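A minimal sketch of that listen-and-forward core, assuming Python; the names (LISTEN_ADDR, connect_target, etc.) are purely illustrative, and a real implementation would also read, rewrite, and relay each request:

```python
import socket

# Illustrative defaults only; both would be configurable in the real tool.
LISTEN_ADDR = ("127.0.0.1", 8888)    # local IP/port the proxy listens on
EXTERNAL_PROXY = None                # e.g. ("203.0.113.5", 3128) to chain

def connect_target(request_host, request_port=80, external_proxy=None):
    """Pick the TCP endpoint to dial: the configured external proxy if
    there is one, otherwise the origin server named in the request."""
    return external_proxy if external_proxy else (request_host, request_port)

def serve_forever():
    """Accept-loop skeleton: each client connection would be read,
    rewritten, and relayed to connect_target(...)."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(LISTEN_ADDR)
        srv.listen()
        while True:
            client, _ = srv.accept()
            client.close()  # placeholder: real code relays the request here
```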
Fix headers from poorly coded program requests:
-Strip pre-expired cookies from server replies so clients never see them and poorly coded ones never send them back; this helps prevent detection.
-Ignore 'null' or malformed headers in client requests (things like "Cookie: " and "Referer: " with no data); this also helps prevent detection.
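Those two fix-up steps could look something like this sketch (Python, hypothetical helper names of my own invention):

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime

def is_pre_expired(set_cookie_value, now=None):
    """True if a Set-Cookie line carries an expires= date in the past,
    i.e. the server is deleting the cookie; these should be eaten by
    the proxy so badly behaved clients never echo them back."""
    now = now or datetime.now(timezone.utc)
    for part in set_cookie_value.split(";"):
        name, _, value = part.strip().partition("=")
        if name.lower() == "expires":
            try:
                expires = parsedate_to_datetime(value)
            except (TypeError, ValueError):
                return False
            if expires.tzinfo is None:
                expires = expires.replace(tzinfo=timezone.utc)
            return expires < now
    return False

def clean_client_headers(headers):
    """Drop headers sent with no value ("Cookie: ", "Referer: "),
    which poorly coded clients emit and which can flag automation."""
    return [(k, v) for k, v in headers if v.strip()]
```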
Monitor reply headers:
-Extract Set-Cookie: headers from HTTP replies and track them with proper domain and expiry data.
-Delete expired cookies and never send them in requests.
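The cookie tracking above could be a small jar like this; a sketch only, with simplified domain matching and no path handling:

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime

class CookieJar:
    """Minimal illustrative jar: stores cookies per (domain, name) and
    silently drops expired ones so they are never sent in requests."""

    def __init__(self):
        self._jar = {}  # (domain, name) -> (value, expires-or-None)

    def store(self, set_cookie_value, default_domain):
        first, _, attrs = set_cookie_value.partition(";")
        name, _, value = first.strip().partition("=")
        domain, expires = default_domain, None
        for part in attrs.split(";"):
            k, _, v = part.strip().partition("=")
            if k.lower() == "domain":
                domain = v.lstrip(".")
            elif k.lower() == "expires":
                try:
                    expires = parsedate_to_datetime(v)
                except (TypeError, ValueError):
                    continue
                if expires.tzinfo is None:
                    expires = expires.replace(tzinfo=timezone.utc)
        self._jar[(domain, name)] = (value, expires)

    def cookie_header(self, host, now=None):
        """Build the Cookie: value for a request to `host`, pruning
        anything that has expired along the way."""
        now = now or datetime.now(timezone.utc)
        live = []
        for (domain, name), (value, expires) in list(self._jar.items()):
            if expires is not None and expires < now:
                del self._jar[(domain, name)]   # expired: forget it
            elif host == domain or host.endswith("." + domain):
                live.append(f"{name}={value}")
        return "; ".join(live)
```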
Modify request headers before sending them to neopets:
-Change "Cookie:" header in requests to those currently tracked by the proxy.
-Replace other headers with the masquerade profile so requests appear to come from the browser.
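Putting those two rewrites together might look like this (illustrative sketch, my own names):

```python
def rewrite_request_headers(headers, tracked_cookies, masquerade):
    """Replace whatever Cookie header the client sent with the proxy's
    tracked cookies, then overlay the saved browser masquerade headers
    (User-Agent and friends) so the request looks like the browser."""
    out = dict(headers)
    out["Cookie"] = tracked_cookies
    out.update(masquerade)          # masquerade wins over client values
    return out
```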
Standard proxy behavior:
-Change the request URL in the GET/PUT/etc. line from the absolute form a proxy receives to the path-only form the server expects. (e.g. "GET http://www.neopets.com/index.phtml HTTP/1.1" becomes "GET /index.phtml HTTP/1.1")
-Change the Host: header from the proxy's address to the host named in the URL. (e.g. "Host: 127.0.0.1" becomes "Host: www.neopets.com")
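These fix-ups are mechanical; a sketch, assuming the client sends the usual proxy-style absolute URI:

```python
from urllib.parse import urlsplit

def normalize_request_line(request_line):
    """Turn a proxy-style request line with an absolute URI into the
    origin-form line the server expects, plus the Host header value
    it implies (None if the line was already origin-form)."""
    method, url, version = request_line.split(" ", 2)
    parts = urlsplit(url)
    if parts.scheme:                       # absolute URI from the client
        path = parts.path or "/"
        if parts.query:
            path += "?" + parts.query
        return f"{method} {path} {version}", parts.netloc
    return request_line, None
```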
Header masquerading:
-Localhost proxy should use the first connection to auto-configure a masquerade profile: the user first connects with his browser through the localhost proxy and logs into neopets, then the proxy extracts the User-Agent:, Accept-Encoding:, Accept:, Keep-Alive:, and various other static headers the browser sends, and saves them for modifying subsequent requests.
-UI should have dialog to manually (re)configure masquerade profile if needed.
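The auto-configure step above could be as simple as this sketch (the header list is illustrative, not exhaustive):

```python
# Headers considered "static" for a browser session; list is illustrative.
STATIC_BROWSER_HEADERS = ("User-Agent", "Accept", "Accept-Encoding",
                          "Accept-Language", "Accept-Charset", "Keep-Alive")

def capture_masquerade(first_request_headers):
    """Build the masquerade profile from the first (browser) connection:
    keep only the static identifying headers, ignore per-request ones
    like Cookie or Referer."""
    return {k: v for k, v in first_request_headers.items()
            if k in STATIC_BROWSER_HEADERS}
```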
If we had a program like this, we could run multiple programs at once on the same account without worrying about one of them getting logged out, or about the account getting flagged for having multiple clients tracked at the same time. That minimizes the risk of using programs while increasing our ability to use the site.
The ability to send all requests through an external proxy means the user can configure his browser/programs to use this localhost proxy and rely on it to forward their requests through an external proxy server, hiding the IP he's using.
If the user finds himself wanting to use multiple accounts in his program(s), he could just run multiple instances of this proxy, configure each instance to listen on a different port, and then point each program at that localIP:port as the proxy for that account. With a separate proxy instance for each account, cookies and masquerade data would never get confused between accounts.
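For instance, assuming the tool existed and took hypothetical --listen and --proxy flags (pure invention on my part, as a config sketch):

```shell
# Hypothetical command name and flags, for illustration only.
# Instance for account #1: listen on 8888, chain through one external proxy.
neoproxy --listen 127.0.0.1:8888 --proxy 203.0.113.5:3128 &
# Instance for account #2: its own port and its own external proxy.
neoproxy --listen 127.0.0.1:8889 --proxy 198.51.100.7:3128 &
# Then set each program's proxy to its own instance, e.g. 127.0.0.1:8888.
```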
Personally I think this is a grand idea. What do you think?