
Re: [users@httpd] Re: 0 length robot.txt


Hi Kremels,

You can check which virtualhost is being served via apache2ctl, like this:

  $ apache2ctl -S

As apache2ctl -h explains:
  -S                 : a synonym for -t -D DUMP_VHOSTS -D DUMP_RUN_CFG
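For reference, the output looks roughly like this (the hostnames and file paths below are illustrative, not from your setup):

```
$ apache2ctl -S
VirtualHost configuration:
*:443       is a NameVirtualHost
     default server example.com (/etc/apache2/sites-enabled/000-default.conf:1)
     port 443 namevhost example.com (/etc/apache2/sites-enabled/000-default.conf:1)
     port 443 namevhost other.example.org (/etc/apache2/sites-enabled/other.conf:1)
```

The "default server" line is the one that answers when no ServerName matches, which is a common cause of the wrong vhost being served.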

After confirming that the right vhost is being served, strip out the proxy logic until the plain robots.txt is served correctly again, then re-add the proxy config piece by piece until the PHP works again.
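As a rough sketch of where you might end up (the domain, DocumentRoot, and PHP-FPM socket path below are assumptions, not taken from your setup), a vhost that serves robots.txt as a plain file while still handing PHP to a backend could look like:

```apache
<VirtualHost *:443>
    ServerName example.com
    DocumentRoot /var/www/example

    # Explicitly disable proxying for robots.txt so it is
    # always served as a static file from DocumentRoot.
    ProxyPassMatch "^/robots\.txt$" "!"

    # Only .php requests go to PHP-FPM.
    <FilesMatch "\.php$">
        SetHandler "proxy:unix:/run/php/php-fpm.sock|fcgi://localhost"
    </FilesMatch>
</VirtualHost>
```

The important detail is that the proxy exclusion ("!") must come before any broader ProxyPass rules, since the first match wins.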

If you can, post the full vhost config for the misbehaving domain here.

The important part is: a zero-length robots.txt by itself doesn't break httpd.

On Wed, Oct 3, 2018 at 2:59 PM @lbutlr <kremels@xxxxxxxxx> wrote:
On 03 Oct 2018, at 11:39, @lbutlr <kremels@xxxxxxxxx> wrote:
> Removing that file made the site load properly.

Well, it did for about 3 hours and 25 minutes, in fact.

Just after posting the message, the site went back to showing only “File Not Found”

I’m at a loss.

The only other issue I see is that the main http-error log contains repeated instances of:

[ssl:info] [pid 43234] (70014)End of file found: [client 106.45.1.92:48564] AH01991: SSL input filter read failed.

(From various client addresses)

The site in question gets an A+ grade from SSL Labs, and this error message appears to be somewhat spurious: Apache tries the default certificate before it receives the server name, then loads the correct certificate, so I don't think this is really an issue.
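For what it's worth, AH01991 is logged at the "info" level, so if these messages are just noise, they can be hidden with mod_ssl's per-module log threshold without lowering logging elsewhere (a sketch, assuming a LogLevel of info overall):

```apache
# Keep the general log level at info, but only log
# warnings and above from mod_ssl, which suppresses
# the AH01991 "SSL input filter read failed" notices.
LogLevel info ssl:warn
```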

--
Han : This is not going to work.
Luke: Why didn't you say so before?
Han : I did say so before!


---------------------------------------------------------------------
To unsubscribe, e-mail: users-unsubscribe@xxxxxxxxxxxxxxxx
For additional commands, e-mail: users-help@xxxxxxxxxxxxxxxx



--
[ ]'s

Filipe Cifali Stangler