Server :: HTAccess - Auto Index Not Showing Some Directories
Oct 5, 2010
I have a VERY simple setup, or so I think: one VirtualHost whose only purpose is to list some files and directories. Under the DocumentRoot there is one directory named clients. Inside clients there are 4 folders, you guessed it, one for each client. I set up a simple .htaccess for each of these folders so each one has its own auth.
Now on the vHost I have this:
Code:
<VirtualHost *:443>
ServerName vpn.domain.com
ServerAlias vpn.domain.lan
ServerAlias vpn
DocumentRoot /var/www/vpn
#ErrorLog /var/log/apache2/http-vpn.log
#LogLevel error .....
Now the weird part: with AllowOverride AuthConfig, those 4 client directories are not displayed. If I set AllowOverride All, the directories inside clients are still not listed, and if I set it to None.... then the server lists them but ignores the .htaccess.
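In case it helps to compare, here is a minimal sketch of what the vhost's Directory block could look like to get both the listing and per-directory auth; the path comes from the DocumentRoot above, everything else is an assumption:
Code:
# hypothetical httpd.conf snippet: Indexes produces the listing,
# AuthConfig lets each client folder's .htaccess add its own auth
<Directory /var/www/vpn/clients>
    Options +Indexes
    AllowOverride AuthConfig
    Order allow,deny
    Allow from all
</Directory>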
I have a site hosted with a cheap hosting company and limited control of the site. I need to allow some other users all over the world to have write access to files or complete directories, and I have no idea how to do that. Initially I thought I could use "chown" somehow, but it looks like that's a no-go over FTP and the like. By default there is a .htaccess file and a .htpasswd file in the root directory of the site, and the hosting company suggested using a .htaccess file with something like the one below:
[Code]....
and put it in a .htaccess file in user1's directory, but the server has not liked something since I inserted that file. Is it an error in the script, or is there more to it than that? Can someone point me to a suitable tutorial or explain what to do?
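For comparison, a typical per-directory password-protection .htaccess looks like the sketch below; this is generic, not the host's suggested file, and the AuthUserFile path is a placeholder for the real absolute path to the .htpasswd:
Code:
# generic basic-auth sketch; all names and paths are placeholders
AuthType Basic
AuthName "user1 area"
AuthUserFile /full/path/to/.htpasswd
Require user user1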
I have just recently installed Fedora 14 Standard Edition. After mounting another partition, I attempt to find the new partition's files in the directory I created for the mount point, but they are not showing. I know the mounted partition is there because when I run "ls -l /iso", all the files show up.
I had previously tried out Fedora 14 with XFE, and this problem did not occur. I also previously installed this same Standard Edition, and the problem did not occur then either. This is from the exact same download; the only difference is that I installed GRUB into a different partition. But now I am unable to find my new folders in GNOME.
Code:
$ aptitude s
sbin/ selinux/ server/ srv/ sys/
This is what happens when I hit TAB. It always showed the "show" and "search" actions before... Why directories now?
I want to return a 404 error when the index.html file is requested. I already know how to do this in PHP, but I can't seem to locate any information about how to do it in .htaccess. I thought about just redirecting index.html to a page that does not exist, but I would like to do it correctly from the start.
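For what it's worth, mod_alias can do this directly: its Redirect directive accepts a bare numeric status, and for codes outside the 300-399 range the target URL is omitted (check your Apache version's docs to confirm support):
Code:
# serve a genuine 404 for index.html instead of redirecting to a dead page
Redirect 404 /index.html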
I have created a subdirectory on my box on a website for my company. It is a page that has links to the tools I want to use when I do service calls: links that connect to my servers, Webmin, etc. Of course I don't want them found by web-crawling bots. I have created a .htpasswd file using htpasswd -c /location/to/file/.htpasswd.
This file is located outside the web root, just outside the public_html folder. Then I went to the subdirectory I want to protect and added a text file named .htaccess. It contains:
I also opened httpd.conf and changed AllowOverride to All.
The error document doesn't work either.
I then restarted the httpd service. I try to access the site and it lets me right in without asking for a password. It is Apache 2.x on CentOS 4.5. Webmin's Apache module confirms all this.
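Since the .htaccess contents weren't shown, here is a known-good sketch for this layout to compare against; AuthUserFile must be the absolute path (the same one used with htpasswd -c above):
Code:
# sketch of a working auth .htaccess for the protected subdirectory
AuthType Basic
AuthName "Service tools"
AuthUserFile /location/to/file/.htpasswd
Require valid-user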
I want to use the rewrite module to change my site URL from site/index.php to site/cat/. For example, I have created a .htaccess file in the directory where the file is and added to it:
Code:
RewriteEngine on
RewriteRule ^cat$ /site/index.php
Here "site" is not my real site; it's just an example. I have rewrite.load in the list of loaded modules and I can see it loads in phpinfo(). AllowOverride is set to All for the current site, but no rewrite happens on site/cat.
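One guess worth checking: the goal was site/cat/ with a trailing slash, but ^cat$ only matches cat without one, and in a per-directory .htaccess the substitution is resolved relative to that directory, so restating the site prefix can misfire. A sketch of both fixes:
Code:
RewriteEngine on
# optional trailing slash; substitution relative to this directory
RewriteRule ^cat/?$ index.php [L]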
I have a blog on my site and am using .htaccess rewrite rules to block all those nasty scripts from trying to execute various things, mostly relating to phpMyAdmin and WordPress. This has cut my httpd error logs to less than half of what they were.
I am trying to come up with a rule to rewrite all calls to certain files if they do not originate from my domain. Here is how it looks right now, but it's not working: I can see scripts trying to hit "wp-comments-post.php" getting a 500 Internal Server Error.
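For reference, a referer check of this shape is the usual approach; example.com stands in for the real domain, and [F] answers with a 403 rather than rewriting anywhere (a 500 here is often a typo in a directive or a rewrite loop):
Code:
RewriteEngine On
# deny the comment script unless the referer is this domain
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
RewriteRule ^wp-comments-post\.php$ - [F,L]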
I create and edit a .htaccess file under the /var/www/html/ directory; everything goes as expected. The corresponding snippet of /etc/httpd/conf/httpd.conf:
I have searched high and low, both on this website and the big G, and found nothing. I have a VPS with a fresh CentOS 5.5 install and can't seem to get the server to act on the .htaccess file.
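One thing to rule out: the stock CentOS httpd.conf ships with AllowOverride None for the document root, which makes Apache ignore .htaccess completely. A sketch of the change, using the CentOS default paths:
Code:
# /etc/httpd/conf/httpd.conf -- let the docroot honor .htaccess
<Directory "/var/www/html">
    Options Indexes FollowSymLinks
    AllowOverride All
    Order allow,deny
    Allow from all
</Directory>
# then: service httpd reload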
My Pastebin for the .htaccess: if you can offer any tips on improvements, please do, but the main reason I'm posting is that I cannot get the bots to stop showing up, especially the first one in the list. I need to block these two specifically:
Code:
Mozilla/5.0 (Windows; U; Windows NT 6.1; en-GB; rv:1.9.2.8) Gecko/20100722 Firefox/3.6.8 &
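Matching the full User-Agent string with SetEnvIf is the usual way to block a specific client; a sketch, with the caveat that this is an ordinary Firefox 3.6.8 string, so real users with that build would get blocked too:
Code:
# tag requests carrying this exact UA build string, then deny them
SetEnvIfNoCase User-Agent "Gecko/20100722 Firefox/3\.6\.8" bad_bot
Order Allow,Deny
Allow from all
Deny from env=bad_bot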
I have a CentOS 5.5 server and I'm hosting 3 different name-based virtual hosts, using Joomla 1.5 for the sites. Each vhost has its own subdirectory under /var/www, so I have /var/www/vhost1, /var/www/vhost2, etc.
It works perfectly, but I'm getting a lot of unwelcome spiders and bad bots trawling for MP3 files, because there are a few MP3s on one of the sites. I've added all the bad bots to my .htaccess file, and now I need to know whether I should make only ONE .htaccess and put it under /var/www, OR create a different .htaccess file for each vhost (this would be an ideal solution, but I'm not sure if it can be done).
I have checked and double checked the .htaccess and httpd.conf to make sure everything is correct and I'm sure it is. I have AllowOverride set to All in the httpd.conf, Apache sees and is reading the .htaccess because I get a 500 Error when I put garbage text in the file. I'm lost now.
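On the one-or-many question: .htaccess directives merge downward, so a single file in /var/www is inherited by every vhost docroot beneath it, provided AllowOverride permits it there; alternatively, skip .htaccess and state the rules once in httpd.conf. A sketch with "BadBot" as a placeholder pattern:
Code:
# httpd.conf -- covers all three vhosts, whose docroots sit under /var/www
<Directory /var/www>
    SetEnvIfNoCase User-Agent "BadBot" bad_bot
    Order Allow,Deny
    Allow from all
    Deny from env=bad_bot
</Directory>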
As many developers probably do, I have a Windows-based machine on which I run XAMPP locally to test my code, and a Linux machine with Fedora as my remote server. As I sometimes use .htaccess to authenticate some parts of the website, I end up having two .htaccess files: one with the local path (something like D:\My_Webs) and one with my remote path (something like /var/www/html/) to the password file. I have searched high and low, but I cannot seem to find any trick so that I only have to maintain one version of the .htaccess file that works on both the Linux and the Windows machine.
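One hedged idea, assuming Apache 2.4's Define directive on the XAMPP side (on older versions, pass -D WINDEV on the httpd command line instead): set a marker on one machine and branch on it inside a single shared .htaccess with <IfDefine>. WINDEV is a made-up name:
Code:
# shared .htaccess; WINDEV is defined only in the XAMPP httpd.conf
<IfDefine WINDEV>
    AuthUserFile "D:/My_Webs/.htpasswd"
</IfDefine>
<IfDefine !WINDEV>
    AuthUserFile /var/www/html/.htpasswd
</IfDefine>
AuthType Basic
AuthName "Protected"
Require valid-user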
I am using Apache with Kerberos security enabled. The HTTP page simply lists the directories contained in /var/www/html. I want to make only one of the directories in the document root secured, so that when someone clicks on it, he/she is required to enter credentials. Right now, when I place the .htaccess file in the directory I want to protect, the directory is hidden from the listing, and the user has to enter the whole path to get authenticated and access the files.
I've tried Options +Indexes, which is posted all over the net, but it didn't work.
I have a folder /home/dryaf/Desktop/site and I want to copy its contents to the folder ~/www/sitename on the remote server. How do I do that? Also, does scp copy hidden files like .htaccess?
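A sketch of the usual invocations, with user@host as a placeholder; scp -r copies everything inside a directory, hidden files included (it's the shell glob * that misses dotfiles), while rsync's trailing slash copies contents into an existing directory:
Code:
# creates remote www/sitename as a copy of the folder (dotfiles included)
scp -r /home/dryaf/Desktop/site user@host:www/sitename
# or copy the folder's contents into an existing remote directory:
rsync -av /home/dryaf/Desktop/site/ user@host:www/sitename/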
I recently installed Ubuntu on my PC without uninstalling Windows, because the PC isn't really mine, it's my father's. When he saw the boot menu (like a month later, he REALLY uses the PC...) he went crazy and instantly ordered me to "uninstall that thing". Of course I'm not stupid enough to do that after a month of seeing what Ubuntu is like... I want to know if/how I can make the PC start Windows automatically, without showing the OS-selection screen, while the boot menu (I don't remember, but I think it's F8) still offers both choices, so I can keep Linux without him knowing.
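On Ubuntu releases using GRUB 2, the usual trick is to hide the menu and default to the Windows entry; a sketch of /etc/default/grub, where the entry number is a made-up example to check against /boot/grub/grub.cfg (holding Shift during boot still brings the menu back):
Code:
# /etc/default/grub -- boot straight into Windows, no visible menu
GRUB_DEFAULT=4    # hypothetical index of the Windows entry in grub.cfg
GRUB_TIMEOUT=0    # don't wait for a choice
# apply with: sudo update-grub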
I need to redirect through a .htaccess file in my root folder. The redirect needs to be done from http://www.department.univeristy.edu/reuir to a different server [url]. I am having trouble determining the pattern required for it to take effect.
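Since the target server was elided, here is the general shape with a placeholder destination; mod_alias's Redirect matches a plain URL-path prefix, so no regex is needed unless you switch to RedirectMatch:
Code:
# .htaccess in the document root; destination is a placeholder
Redirect permanent /reuir http://other.server.example/reuir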
On my Nagios server, a .htaccess file has been created so that any time you want to open Nagios, a window pops up for you to enter a username and password. What I want to do now is integrate Nagios into a portal written in PHP, so that when customers log in to the portal, they can access Nagios without it popping up the username/password window. What will happen is that the Nagios password will be stored in an Orient database, so when users try to access Nagios through the portal, they will be logged in automatically.
I have a site hosted with a cheap hosting company, and I need to allow write access to certain users in certain directories, sometimes on a per-file basis.
Q: how do I do that in a .htaccess file?
I have never used .htaccess, although from what I read it looks straightforward; but when I try it, I get a "500 server error" even with the example .htaccess file the hosting company suggested I use. (I have informed the host, as they require, and am awaiting their answer.)
The site was automatically set up with a couple of dot files in the root directory when I got it; they are:
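Two general points while waiting for the host, then a sketch: .htaccess governs access over HTTP, not filesystem write permission for FTP users, so "write access" may need a different mechanism entirely; and a 500 from an apparently fine file often means it uses a directive the host hasn't enabled via AllowOverride. A per-file restriction looks like this, with every name a placeholder:
Code:
# hypothetical per-user, per-file restriction in a directory's .htaccess
AuthType Basic
AuthName "Members"
AuthUserFile /full/path/to/.htpasswd
<Files "report.doc">
    Require user user1
</Files>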
I have SLES10 SP3 with Apache2 version 2.2.3 and PHP5.
I have a virtual host configuration with one vhost.
The /etc/apache2/listen.conf:
Code:
In _web-ims.conf:
Code:
This directory (/etc/apache2/conf.d) also contains the PHP config file php5.conf, which now gets interpreted (I suppose):
Code:
Actually things are working OK.
However, ON THE SERVER, PHP only gets interpreted in the Firefox browser when the site is called via localhost: [url] // works fine.
When called through the IP:
BUT when I call [url]
FROM ANOTHER CLIENT COMPUTER, for example my Windows XP machine using Internet Explorer, it works!
So, does someone have any idea why the same call that works from another client does not work on the server itself? And why does the call with localhost on the server work?
Is that a bug in the Firefox browser, or where do I have to tell it that it should interpret the PHP code also via the IP and not only via localhost?
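A hedged guess, since the configs were elided: with name-based virtual hosts, a request made by bare IP carries the IP in the Host header, doesn't match the vhost's ServerName, and falls through to the first/default vhost, which may be configured differently. Adding the IP (and localhost) as aliases in _web-ims.conf would rule that out; the IP and paths below are placeholders:
Code:
# _web-ims.conf sketch -- let IP-based requests hit the same vhost
<VirtualHost *:80>
    ServerName web-ims
    ServerAlias localhost 192.168.0.10
    DocumentRoot /srv/www/htdocs
</VirtualHost>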
I am administering a few websites, and the latest website to be deployed produces a few "unable to check htaccess file, ensure it is readable, referer: " errors in the Apache error log.
That is fine, because the directory doesn't contain a .htaccess file.
Is there a setting whereby I can turn off this error?
I don't want to turn off checking for .htaccess files altogether, because some directories have this file and use it.
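Two notes: this message usually means a permissions problem (Apache cannot even stat the file or a parent directory), not just a missing file; and .htaccess lookups can be switched off per directory tree, which also silences the check there. A sketch with a placeholder path:
Code:
# httpd.conf -- stop looking for .htaccess under this tree only
<Directory /var/www/site/static>
    AllowOverride None
</Directory>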
I've got a dedicated server using whm/cpanel and there is a particular domain being hosted which contains a very elaborate .htaccess file which is full of mod_rewrite rules, among other things. I've been asked to try and get the contents of this .htaccess file into the httpd.conf (or some appropriate include file) to improve performance.
I've been reading this page and it's really confusing me: [url]
In the file /usr/local/apache/conf/httpd.conf I see that there are two VirtualHost sections that appear to be relevant for my domain (which I'll call mydomain.com). The first listens on port 80 and the second on 443. It seems to me that my Apache directives from .htaccess would belong in these sections. In the first section I see this:
Code:
In the second section I see this:
Code:
The problem with both of these is that the directory /usr/local/apache/conf/userdata does not exist, much less the full path to either of those files.
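If memory serves (cPanel layouts vary by version, so treat every path and script name below as an assumption to verify), you create that userdata path yourself, drop the directives into an include file there, and regenerate httpd.conf; also note that per-directory RewriteRule patterns usually need adjusting (e.g., a leading slash) once they move to vhost context:
Code:
# hedged cPanel sketch -- create the include path referenced in httpd.conf
mkdir -p /usr/local/apache/conf/userdata/std/2/username/mydomain.com
# put the .htaccess directives into .../mydomain.com/include.conf, then:
/scripts/rebuildhttpdconf
/scripts/restartsrv_httpd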
I have displayed web pages (e.g., index.html) under Fedora 4, 5, 7, and 8 before with no difficulty. Now I tried to do a similar thing under Fedora 12, and I cannot display index.html. Is Fedora 12 special? I did '#tail -f /var/log/httpd/error_log' and noticed a line:
"SELinux policy enabled; httpd running as context system_u:system_r:httpd_t:s0"
Is the SELinux policy blocking me from displaying the index.html page?
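That log line only says SELinux is active, not that it denied anything, but it's easy to check; a sketch of the usual diagnosis, assuming the page lives in the default /var/www/html:
Code:
# inspect the file's SELinux context
ls -Z /var/www/html/index.html
# restore the default httpd_sys_content_t context if the file was copied in
restorecon -Rv /var/www/html
# quick test only: setenforce 0, retry the page, then setenforce 1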
I have been working on this for the last couple of days, but I don't seem to be getting anywhere with it.
I have a Fedora 12 64-bit server set up, and for the most part it's working great. I have 3 vhosts set up, and no problems with that.
My problem is I just moved my wife's xcart store from GoDaddy to this server, and the store works great. But xcart is set up to use Clean URLs, and this is what I seem to be having all the problems with.
In my "httpd.conf" file, I have "AllowOverride All" to use ".htaccess" and in xcart ACP I have it set to use Clean URL's. I added a ".htaccess" file to root folder, with this in it.
Quote:
Now if you try to open the store with just the base URL "www.my_site.com/", I get the Apache test page. But if I enter "www.my_site.com/index.php", the Home page opens, and all clean URLs from there work fine. ALL other pages open just fine. So Clean URLs are working; it's just the home page that won't load right.
Now if I change the ".htaccess" file and set "RewriteEngine Off", then the home page loads like it should with the base URL, but then Clean URLs no longer work. I have to change "AllowOverride All" to "AllowOverride None" and turn Clean URLs off in the xcart ACP to get the store working again.
Then the whole site works just fine, just no Clean URLs. I'm at my wits' end with this and don't know what else to try, so I was hoping someone here might have an answer, or at least some ideas to try. This all worked just fine on GoDaddy's server; I left because it was just too damn slow. It took a minute and a half just to load the home page; on my server it's less than 2 seconds. And every time I asked them about it, it would speed up for a few days and then slow down again. But the same .htaccess file worked on that server, so I think my problem is in my "httpd.conf" file, I just don't know where.
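A hedged guess for the test-page symptom: on Fedora the test page comes from /etc/httpd/conf.d/welcome.conf, which kicks in when a request for / finds no DirectoryIndex file, and the stock DirectoryIndex may not include index.php. Both are quick to check:
Code:
# httpd.conf -- make the PHP front controller a directory index
DirectoryIndex index.php index.html
# and neutralize the test page by emptying /etc/httpd/conf.d/welcome.conf,
# then: service httpd restart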
I've set up Apache once or twice in the past, but my memory is escaping me on something simple. This time the OS is CentOS 5 with Apache/2.2.3.
When a user browses to [url] <nothing else>, I get, as expected, a 403 "You don't have permission to access" error, because directory browsing is off for obvious reasons :-) If I enter the full URL to the script [url], it works as expected. No issues there.
What I would *like* to do (and I'm sure I've done this before) is to set the cgi-bin up so that if someone leaves off the script name, http://host.name/cgi-bin/ serves "index.cgi", pretty much the same way that browsing to the root HTTP directory would normally serve index.html or index.php. I just can't remember how to achieve this simple thing, and I'm starting to wonder: am I mad? Did I do this before?
Sanity check - index.cgi exists and runs if you call it directly thus:
Code:
I have added:
Code:
That made no difference.
Tried:
Code:
For good measure, but that made no difference - and even
Code:
and no amount of playing with 'Options Indexes || -Indexes || +Indexes' helped me.
Like I say, I'm sure I've done it before (on Debian/Ubuntu) without needing to play with redirects or .htaccess - but I'll be blowed if I can remember how.
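For what it's worth, DirectoryIndex is honored inside a ScriptAlias'd directory as long as mod_dir is loaded, so a sketch like this (using the stock CentOS cgi-bin path) should serve index.cgi for a bare /cgi-bin/ request; Options Indexes is a red herring here, since it only controls auto-generated listings:
Code:
# httpd.conf -- run index.cgi when /cgi-bin/ is requested without a script name
<Directory "/var/www/cgi-bin">
    DirectoryIndex index.cgi
    AllowOverride None
    Order allow,deny
    Allow from all
</Directory>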