Wednesday, September 19, 2012

Setting Up A Server Somewhere Else

HostRocket is having trouble figuring out why HTTP request logging for my web pages isn't working.  (Yes, I find myself mystified why a server company is finding this so darn difficult.)

I really don't want to hassle with setting up Apache behind a firewall to see the requests being sent.  But perhaps if HostRocket can't figure out how to enable HTTP logging by tomorrow, I may not have any choice.

UPDATE: HostRocket has logging working.  I can see the GET requests--but my ksh CGI script doesn't behave the same when called from their web page as when I feed the logged GET request into it directly.  I'm guessing that something about a ksh script runs differently as a CGI than when run from the shell on the server.  I can see why this is a demanding test--you have to work around lots of obstacles while doing something that is a bit unusual.

I am increasingly mystified by this.  I'm not sure if something needs to be turned on to run CGI scripts or what.  The only example that I can find of a ksh CGI script starts out like this:

#!/bin/ksh
PARM_PASSED=$(< /dev/stdin)

This works from the shell, but not when sending a request of the form

http://www.claytoncramer.com/cgi-bin/balihoo.cgi?q=Ping&d=Please+return+OK+so%20HTTP/1.0

Nothing comes through.  Any ideas?  I took the day off work to get a number of things working--including what should be a trivial task like this.

It is becoming increasingly apparent that this simply doesn't work in a CGI setting.  I need to interrogate QUERY_STRING instead.
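Here is a minimal sketch of what interrogating QUERY_STRING looks like.  For a GET request the web server puts everything after the "?" into the QUERY_STRING environment variable; stdin only carries a body on a POST, which is why $(< /dev/stdin) comes up empty.  The q and d parameter names come from the logged request; %XX decoding is left out for brevity, and printf is used in place of ksh's print so the sketch also runs under plain sh:

```shell
# Write a small GET-handling CGI that parses QUERY_STRING instead of stdin.
cat > query.cgi <<'EOF'
#!/bin/sh
q="" d=""
oldifs="$IFS"; IFS='&'
# Split the query string on "&" and pick out each name=value pair.
for pair in $QUERY_STRING ; do
    name="${pair%%=*}"
    value="${pair#*=}"
    case "$name" in
        q) q="$value" ;;
        d) d="$value" ;;
    esac
done
IFS="$oldifs"
printf 'Content-type: text/plain\n\n'
printf 'q=%s d=%s' "$q" "$d"
EOF
chmod +x query.cgi

# Exercise it the way the web server would: QUERY_STRING in the environment.
QUERY_STRING='q=Ping&d=Please+return+OK' ./query.cgi > query.out
cat query.out
```

Running it from the shell with QUERY_STRING set by hand makes it easy to test without going through the web server at all.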

UPDATE 2: Yup, don't believe everything you find on the Internet, I tell my students, and that's true for technical solutions too.  I'm getting there.

UPDATE 3: There seem to be some quirks built into their web page specifically to make it inconsistent and hard.  For example, the first GET request is named Ping and expects you to return OK--but there can't be a trailing line feed.  I've determined this with a trivial script, http://www.claytoncramer.com/cgi-bin/ok.cgi, that is as simple as it gets:

#!/bin/ksh
print "Content-type: text/plain"
print ""
print -n "OK"

This gets past the first GET request, and the rest of the GET requests then come through.  I can see all of them in my log file, and I can feed them into the actual server script--and they work perfectly.  However: if I give that server script, http://www.claytoncramer.com/cgi-bin/balihoo.cgi, to their test page, it does not get past that first Ping, even though it produces exactly the same response:

Content-type: text/plain

OK

(with no closing line feed after the OK)

Instead, it complains that it was expecting OK but got ".

The period after the quote is probably a line feed.  I am temporarily stumped by this: a trivial server script at least gets me past the first GET, but another one that produces the same output from the same input does not work.  All I can think is that they have something that is timing dependent.  Perhaps I will put a sleep into the script to keep it from running too fast.  I'm grasping at straws here, because I just don't see what else could cause the same data sent back to produce such different results.
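One way to rule out an invisible trailing line feed is to compare the responses byte for byte.  The sketch below writes the expected 28-byte response to a file ("Content-type: text/plain" is 24 bytes, the two line feeds make 26, and "OK" makes 28) and inspects it with od -c, which makes a stray \n visible; capturing each script's real output the same way (script > out; od -c out) would show whether the two responses truly match:

```shell
# Build the exact bytes the checker should receive and inspect them.
printf 'Content-type: text/plain\n\nOK' > resp.txt

# od -c shows every byte, so a trailing \n is easy to spot.
od -c resp.txt

# wc -c counts bytes exactly: the expected total is 28.
wc -c resp.txt
```

If the count comes back 29, or od shows a final \n, the script is emitting a closing line feed somewhere.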

UPDATE 4: I can reproduce the problem on my own by using http://www.claytoncramer.com/cgi-bin/balihoo.cgi?q=Ping&d=Please+return+OK+so+that+I+Know+your+service+works as the request, which returns Internal Server Error, while http://www.claytoncramer.com/cgi-bin/ok.cgi?q=Ping&d=Please+return+OK+so+that+I+Know+your+service+works actually works.  Perhaps I will be able to find the server error log.
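Short of the server error log, an Internal Server Error can often be narrowed down from the shell by running the script exactly as Apache would: QUERY_STRING in the environment, stdout and stderr captured separately, and the exit status checked--a nonzero status or anything on stderr is typically what the browser reports as Internal Server Error.  The trivial ok.cgi stands in for the real script in this sketch, rebuilt with printf so it runs under plain sh:

```shell
# Recreate the trivial CGI locally.
cat > ok.cgi <<'EOF'
#!/bin/sh
printf 'Content-type: text/plain\n\n'
printf 'OK'
EOF
chmod +x ok.cgi

# Run it the way Apache would, capturing stdout and stderr separately.
QUERY_STRING='q=Ping&d=Please+return+OK+so+that+I+Know+your+service+works' \
    ./ok.cgi > out.txt 2> err.txt
echo "exit status: $?"

cat out.txt
```

Substituting balihoo.cgi for ok.cgi in the same recipe should show whether it exits nonzero or writes to stderr when the server chokes on it.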

UPDATE 5: So much work just to submit a resume to a company that probably won't bring me in for an interview.  It turns out that when I try to run the full server script, I get an error in the server log:

ModSecurity: Audit log: Failed to create subdirectories: /var/asl/data/audit/20120920/20120920-1531 (Permission denied)

I am guessing that CGI scripts require some permission turned on to run on HostRocket's servers, and I may not have that permission.  My script may actually work--but not with how the server it is running on is configured.

One of the theories behind making it so difficult to submit a resume is that an employer only wants to hire the best of the best--those with both the technical skills and the patience to fight their way through all these sorts of questions.  (The position for which I am applying is Java developer--not server guru.)  Doubtless, this prevents a number of less than brilliant sorts from applying--but I find myself wondering if applicants who have ten positions to which they can apply might look at the amount of work involved in applying for this position and decide that they have better things to do with a day than apply for one opening.  Perhaps Balihoo is actually getting not the best of the best, but just those members of the "best of the best" who don't have other options.  They may be getting not the "best of the best of the best" but "the worst of the best of the best."
