Server :: Stop The Viewing Of Robots.txt (directories)
Jul 7, 2010
I need to stop the viewing of robots.txt on my website. I get the contents of the file displayed in my browser when I visit: [URL] I want to stop this, as it lists all the directories I don't want visitors going to.
The bigger problem is that I can browse any directory on my site, get a file listing, right-click a file name and save it to my client hard drive, all from the browser. Example: [URL] I think I can change this behaviour in the Apache config but I don't know enough about it.
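A minimal sketch of one way to tackle both symptoms, assuming Apache 2.2 with AllowOverride enabled and a document root of /var/www (both of those are assumptions, not details from the post):
Code:
# drop an .htaccess into the document root (path is a placeholder)
sudo tee /var/www/.htaccess > /dev/null <<'EOF'
# switch off automatic directory listings
Options -Indexes

# refuse direct browser requests for robots.txt
<Files "robots.txt">
    Order allow,deny
    Deny from all
</Files>
EOF
Note that blocking robots.txt also hides it from well-behaved crawlers; an alternative is simply to keep sensitive directory names out of that file.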
View 3 Replies
Jul 7, 2010
I need to stop the viewing of robots.txt on my website. I get the contents of the file displayed in my browser when I visit: [URL] I want to stop this, as it lists all the directories I don't want visitors going to. I am right at the level of Linux knowledge to be dangerous: capable, but still dangerous.
View 5 Replies
Oct 26, 2010
I backed up my Laptop with a script, as follows:
Code:
#!/bin/bash
# burn /home to DVD as an ISO9660 image with Rock Ridge attributes (-r), verbose (-v)
sudo growisofs -Z /dev/dvd -dvd-compat -r -v /home
I then installed a fresh Ubuntu 10.04 from disk and copied the files in /home from the disc back to the hard drive. I can open and view all the files in most directories except those in /home/documents. That directory holds text files created by gedit, OpenOffice word-processor files and several PDF files. I cannot open or view them: for gedit and PDF files I get the error "Don't recognize file type" (even though the file is clearly marked PDF), and the OpenOffice files look like rows of 'high bits', with a dialogue box offering to change the character set, font, language and paragraph break.
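A hedged first diagnostic: check what the files actually contain, since the extension may no longer match the data (the file name and the disc mount point below are hypothetical).
Code:
# 'file' inspects the content, not the name -- a healthy PDF should report "PDF document"
file /home/documents/*
# compare a suspect file against the copy still on the disc
md5sum /home/documents/report.pdf /media/cdrom/documents/report.pdf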
View 6 Replies
Jan 12, 2010
I'm running a VPS with Debian Lenny, an Apache2 web server, memcached, Postgres and the Django framework under mod_wsgi. Some of the pages served by Django take a long time to generate (approx. 15 s), but memcached normally compensates for this.
My problem is that when a robot visits the site, it starts traversing every page, including the ones that have not been generated and cached yet, slowing the server down to the point where it stops responding.
What I'm looking for is a way to identify that a request comes from a robot (user agent, IPs, etc.) and limit the resources it gets, so that, for example, only one thread serves the robot.
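A hedged sketch of the simplest lever: ask well-behaved crawlers to slow down and keep them away from the expensive URLs via robots.txt (the file path and the /reports/ prefix are placeholders; Crawl-delay is honoured by some crawlers such as Bing and Yandex but not all).
Code:
sudo tee /srv/www/mysite/robots.txt > /dev/null <<'EOF'
User-agent: *
Crawl-delay: 10
Disallow: /reports/
EOF
Misbehaving bots ignore this, so rate limiting by user agent or IP at the Apache level would still be needed for them.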
View 2 Replies
Jul 7, 2011
The directories /home/<user_name>/.Trash and /root/.Trash do not exist.
I've tried viewing hidden directories with nautilus and using "cd" in terminal. "locate" is equally useless.
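A hedged pointer: current desktops follow the freedesktop.org trash specification, so the trash no longer lives in ~/.Trash. A quick check:
Code:
# per-user trash under the XDG data directory
ls -la ~/.local/share/Trash/files
# root's trash, when it exists at all
sudo ls -la /root/.local/share/Trash/files
# files deleted on removable media end up in a .Trash-<uid> directory at the mount point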
View 4 Replies
Oct 11, 2010
I'm using the Lego RCX 2.0 to create robots for a class, and I can't figure out how to get the tower to work with Ubuntu.
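A heavily hedged first check, assuming the USB version of the tower (the serial tower just appears as a normal serial port): the Linux kernel ships a legousbtower driver, so the first step is confirming the device node shows up; a programming tool such as nqc or BrickOS would then be pointed at it (package availability and the exact device path are assumptions worth verifying).
Code:
# plug the tower in, then:
dmesg | grep -i lego          # the legousbtower driver should announce itself
ls -l /dev/usb/legousbtower*  # device node created by the driver (path may vary)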
View 4 Replies
Jun 17, 2010
I use the following command to find under /var some param in my script
grep -R "param" /var/* 2>/dev/null |grep -wq "param"
My problem is that after grep finds the param in a file, it keeps searching until everything under /var/* has been scanned.
How can I make it stop immediately after grep matches the param word?
For example, when I run: grep -R "param" /var/* 2>/dev/null |grep -wq "param"
grep finds the param after about one second, but then keeps matching the same param in other files, so the whole run takes almost 30 seconds.
How can I stop grep immediately after the first match?
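One possible rewrite, as a sketch: let a single grep do both jobs and rely on -q, which makes grep exit as soon as any match is found.
Code:
# -q: quit on the first match; -w: whole-word match, as in the original second grep
if grep -Rqw "param" /var/ 2>/dev/null; then
    echo "param found under /var"
fi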
View 1 Replies
Jan 5, 2011
If I install a Squid proxy server, can I view the web pages visited by a particular user/machine in a particular session? I think the information can be analysed from Squid's log files, but can I also view the (static) page itself?
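A hedged sketch of the log-analysis half, assuming Squid's default native access.log format (field 3 is the client address, field 7 the URL) and a made-up client IP:
Code:
awk '$3 == "192.168.1.50" {print $7}' /var/log/squid/access.log
The page content itself is only on disk if Squid happened to cache it; the log records which URLs were requested, not a copy of every page.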
View 6 Replies
Nov 19, 2010
In my Apache logs I have lots and lots of failed attempts for incorrect incarnations of [URL]. None of them are anywhere near my alias for index.php, yet phpMyAdmin is broken. Is there a way I can trip up robots like this? Maybe send IPs that make multiple wrong page requests back to their own address? I would then just set thresholds to decide how strict to be. I did try fail2ban before, but it is cryptic, and I don't have it on this particular server.
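For the threshold-and-ban idea, a hedged fail2ban sketch using the stock apache-noscript filter (the log path is the Debian-style default; on CentOS it would typically be /var/log/httpd/error_log):
Code:
sudo tee -a /etc/fail2ban/jail.local > /dev/null <<'EOF'
[apache-noscript]
enabled  = true
port     = http,https
filter   = apache-noscript
logpath  = /var/log/apache2/error.log
maxretry = 6
bantime  = 3600
EOF
sudo service fail2ban restart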
View 2 Replies
Apr 25, 2011
I like AWStats for viewing and visualising web traffic to my server. However, I am now in a situation where I would like to visualise ALL traffic to/from a network in a way similar to AWStats, that is to say, free and visual. The program will have to be Linux/HTTP/Java based, as it will run on a Linux server fed by a network tap.
View 5 Replies
Jul 23, 2010
I've installed QmailToaster on CentOS and use vqadmin to manage virtual domains. I can list the current domains in the vqadmin interface, but when I choose "View Domain", enter my domain and click "View Domain", I get a blank white page and cannot see any information about the domain. I can create a new domain and view that one in vqadmin, but I cannot view the existing domains.
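A hedged first step, since a blank page from a CGI like vqadmin usually means the script died: watch Apache's error log while reproducing the click (the path is the usual CentOS default and is an assumption).
Code:
sudo tail -f /var/log/httpd/error_log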
View 1 Replies
Jun 9, 2011
I'm trying to rsync files and directories from a Red Hat Linux host (v4.5 and 4.7) to a Windows Server 2003 R2 Standard Edition machine running Cygwin, executing the rsync command from the Cygwin shell. The transfer involves roughly 1 TB of data from the Linux server to the Windows server, and after about 280+ GB the transfer just dies.
There seems to be no particular file or directory the transfer stops at. I am able to rsync gigabytes of data from other Linux hosts to this Cygwin server with no problem; files and directories rsync fine. The network infrastructure is essentially the same regardless of which server is being rsync'ed: gigabit Ethernet through Cisco gigabit switches, with no apparent glitches or hiccups along the path.
I've asked the folks at rsync.samba.org whether they know of any problems or issues. Their response was neutral: if the version of rsync that Cygwin has ported is within standards, there is no rsync-related reason this should happen. I've asked the Cygwin support site if they know of any issues and they have yet to reply. So my question is whether the version of rsync ported to Cygwin is standard, and if so, whether there is any reason Cygwin and rsync keep failing like this.
I've asked the local rsync-on-Linux gurus and they can't see any reason this should fail from the Linux side. Apparently I am our company's Cygwin knowledge base by default.
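A hedged workaround sketch while the root cause is unknown (host and paths are made up): retry until rsync exits cleanly, keeping partial files so each attempt resumes rather than restarts, and time out if the connection stalls.
Code:
until rsync -av --partial --timeout=300 user@linuxhost:/data/ /cygdrive/d/data/; do
    echo "rsync exited $? -- retrying in 60 seconds" >&2
    sleep 60
done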
View 3 Replies
Sep 29, 2009
My current setup is:
old server:
www.mydomain.com main site
www.mydomain.com/subdirectories related sites from same server, different directories
I am adding an additional server that I would like to initially only use for the main site, something like this:
new server
www.mydomain.com main site
www.mydomain.com/subdirectories would be pointed back to the old server instead
What's the best way to redirect the traffic for the sites found in sub-directories on the old server?
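One hedged option on the new server, assuming Apache and assuming the old box stays reachable at a hostname like old.mydomain.com (both assumptions, as are the /subdir1 and /subdir2 paths): redirect the sub-site paths back to it.
Code:
# simple client-visible redirect (mod_alias); swap in mod_proxy/ProxyPass if the old server should stay hidden
sudo tee /etc/apache2/conf.d/legacy-subsites.conf > /dev/null <<'EOF'
Redirect permanent /subdir1 http://old.mydomain.com/subdir1
Redirect permanent /subdir2 http://old.mydomain.com/subdir2
EOF
sudo apachectl graceful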
View 2 Replies
Dec 19, 2009
I am in need of Linux help. I am at college and need this backup/restore script to pass the final part of an assessment. I require a backup script that will not only back up but also restore files to the relevant directories, e.g. users are instructed to store all word-processor files in a directory named wp. So I need to create a backup directory with three directories inside it and some files within those three directories, and then back them up and restore them. I know I should do this myself, and I have been trying to find and understand the information for the last few days, but have come up with zero.
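A bare-bones sketch of the shape such a script could take (directory and file names are made up to match the wp example):
Code:
#!/bin/bash
# set up the sample tree
mkdir -p ~/work/{wp,sheets,notes}
touch ~/work/wp/letter.odt ~/work/sheets/budget.ods ~/work/notes/todo.txt

# back up: one dated tarball of the whole tree
tar -czf ~/work-backup-$(date +%F).tar.gz -C ~ work

# restore: unpack the tarball back into the home directory
tar -xzf ~/work-backup-$(date +%F).tar.gz -C ~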
View 14 Replies
May 15, 2011
I want to set up a web server where multiple users can log in through SFTP to a specific folder, www. Several users are added, let's say user1 and user2, all belonging to the www-data group. The www directory is owned by user www-data and group www-data.
I have used chmod -R 775 on the www folder, but after I create a folder test through SFTP (using FileZilla), the group on the new directory has only r and x permissions, so I cannot log in as the second user, user2, and create a directory inside www/test because the group lacks the w permission.
I also tried chmod 2775 on the www directory, but without luck. How can I make a newly created directory inherit the parent directory's group permissions?
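A hedged sketch of the usual fix: keep the setgid bit (so new entries inherit the www-data group, which chmod 2775 already arranges) and additionally force a group-writable umask on the SFTP sessions, since the missing group w comes from the default umask of 022. The internal-sftp -u option needs a reasonably recent OpenSSH, so treat it as something to verify; the path is assumed.
Code:
sudo chmod 2775 /var/www    # setgid: new files and dirs inherit the www-data group
# in /etc/ssh/sshd_config, replace the sftp subsystem line with:
#   Subsystem sftp internal-sftp -u 0002
sudo service ssh restart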
View 2 Replies
Aug 7, 2010
If I share directories with NFS, how do I control users' access to the information?
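A hedged sketch of the two layers involved: /etc/exports controls which client hosts may mount and whether they get read-write, while ordinary Unix ownership and permissions (with matching UIDs/GIDs on client and server) control what individual users can do. Paths and networks below are placeholders.
Code:
sudo tee -a /etc/exports > /dev/null <<'EOF'
/srv/share   192.168.1.0/24(rw,sync,root_squash)
/srv/public  *(ro,sync,all_squash)
EOF
sudo exportfs -ra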
View 1 Replies
Oct 4, 2010
I'm planning an NFS share for a small enterprise (25 NFS clients). I need to create a directory structure, but I'll need to set different permissions (rw/ro) on some directories of the tree. I wonder if it's possible to grant access using group IDs, which would be ideal for this application. Is it possible? I was also thinking I would need some kind of centralised user information, such as NIS or LDAP. Is that necessary?
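A hedged sketch of group-based permissions on the exported tree (group and directory names are made up). Because NFS matches users by numeric UID/GID, the group memberships have to be consistent on all 25 clients, which is exactly what NIS or LDAP makes manageable.
Code:
sudo groupadd accounting
sudo chgrp -R accounting /srv/nfs/accounting
sudo chmod -R 2770 /srv/nfs/accounting   # rw for group members, setgid so new files keep the group, no access for others
sudo chmod -R 755  /srv/nfs/public       # read-only for everyone else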
View 4 Replies
Jan 4, 2010
I am connecting servers using NFSv4. The shared directories are on servers running Debian 4, while the one that reads from them runs Debian 5.0.3. The problem is that one of these shared servers suddenly stops responding: its export cannot be listed from the Debian 5 server, df hangs, and the web application using that shared directory stops answering requests that touch it, since they block. The load on the server then climbs until it can no longer respond (over 90). I have found many syslog entries referring to this, like:
ma25555 kernel: [1200285.732919] nfs: server 10.xxx.xxx.xxx not responding, still trying
Dec 31 08:16:33 ma25555 kernel: [1200289.815378] INFO: task java:9702 blocked for more than 120 seconds.
Dec 31 08:16:33 ma25555 kernel: [1200289.835249] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
code....
I tested the connection between the two servers with ping for a whole day and everything was fine (zero packets lost).
There are 3 other servers that are running Debian 4 and are working fine.
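A few hedged diagnostics to run from the Debian 5 client the next time it hangs (the IP is a placeholder). With the default hard mount option, processes touching the share block until the server answers again, which matches the symptoms.
Code:
rpcinfo -p 10.0.0.20      # is the NFS service on the Debian 4 box still registered and answering?
nfsstat -c                # client-side call/retransmission counters
mount | grep nfs          # confirm the mount options actually in use (hard/soft, timeo, retrans)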
View 1 Replies
Jul 25, 2011
Because our visitors/customers are short-term, and may have their own mail servers configured incorrectly, we automagically redirect all port 25 traffic going to internal IPs to our own mail servers while they are on our network (Postfix on CentOS 5.6). While I have taken some measures to prevent it from being used for spamming, I would greatly appreciate some assistance. I will be putting in ClamAV, but I haven't hooked it into the mail flow yet. I am using Postfix, but can also add procmail or even SpamAssassin.
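A hedged starting point for keeping the relay from being abused (the rate-limit value is a guess to tune; zen.spamhaus.org is just one example DNSBL):
Code:
# cap how many messages a single client may send per time unit (Postfix anvil service)
sudo postconf -e 'smtpd_client_message_rate_limit = 30'
# refuse relaying to outside destinations except from our own networks, and check a DNSBL
sudo postconf -e 'smtpd_recipient_restrictions = permit_mynetworks, reject_unauth_destination, reject_rbl_client zen.spamhaus.org'
sudo postfix reload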
View 6 Replies
Feb 1, 2010
Our mail server keeps on hanging after a while. This happened after there was a breakdown in electricity supply and the server room air conditioners stopped working for almost half a day. We are not sure whether this is a server hardware problem or a coincidence with a break-in attempt or malware activity.
Following are the messages from the server log: code...
View 14 Replies
Jan 25, 2010
I'm having trouble setting up a vsftpd server correctly. What I want is to allow a number of users to log on (no anonymous user), with each of them taken to their own "top level directory" from which they cannot escape.
I've got most of this working, but I can't find a way to automatically send each user to *their* own working area. The "local_root" directive doesn't quite do what I want, as everybody then shares the same working area (so users could potentially interfere with each other). On the other hand, I don't want each user to work from their home directory, because there are loads of special files there that I don't want users playing with.
To add one extra complication, I'm also running a web server on the same machine. One of the directories the web server can see is one of the FTP root directories, so what I'm trying to do is give one special user the ability to FTP files onto the web server; other users must *NOT* have this ability.
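A hedged vsftpd sketch using per-user configuration files, which is the usual way to give each login its own local_root (user names and paths are examples):
Code:
# in /etc/vsftpd.conf:
#   chroot_local_user=YES
#   user_config_dir=/etc/vsftpd/user_conf
sudo mkdir -p /etc/vsftpd/user_conf
echo 'local_root=/srv/ftp/alice' | sudo tee /etc/vsftpd/user_conf/alice
echo 'local_root=/srv/www/html'  | sudo tee /etc/vsftpd/user_conf/webadmin   # the one user allowed into the web tree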
View 6 Replies
Aug 9, 2010
I would like to copy all the directories (including data) from the Linux box to an external hard drive.
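A hedged sketch with rsync (the mount point is a placeholder): running it as root preserves ownership and permissions, and the excludes skip pseudo-filesystems and the drive's own mount point if the whole root is being copied.
Code:
sudo rsync -aHv --exclude=/proc --exclude=/sys --exclude=/dev --exclude=/media --exclude=/mnt \
    / /media/external/full-copy/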
View 1 Replies
Mar 23, 2011
I am writing a script. My requirement is that all the file types stored in one directory need to be separated into different directories based on file type.
For example, a directory (anish) contains 5 items of different types:
1 directory
2 .txt files
2 .sh files
My requirement is that the directory is moved into a new directory (dir) given in the script, the 2 .txt files are moved into another new directory (test) given in the script, and the 2 .sh files are moved into another new directory (bash) given in the script, so that the directory anish ends up empty. How can this be done with a bash script?
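A short sketch of the sorting script (source and target names follow the example above; adjust as needed):
Code:
#!/bin/bash
src=anish
mkdir -p dir test bash
for entry in "$src"/*; do
    if [ -d "$entry" ]; then
        mv "$entry" dir/       # subdirectories go to dir/
    elif [[ "$entry" == *.txt ]]; then
        mv "$entry" test/      # .txt files go to test/
    elif [[ "$entry" == *.sh ]]; then
        mv "$entry" bash/      # .sh files go to bash/
    fi
done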
View 7 Replies
May 21, 2010
I'm running an Ubuntu 9.10 Linux server. I'm trying to find a way to back up the machine while it is running, and from what I can see this rules out the disk-clone utilities: all the disk-cloning tools I have found for Linux require booting into a special live CD. So my question is: what is the best solution for backing up the system while it is running? I don't really care much about the OS configuration; I just want to keep my stored files and the programs I have installed.
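A hedged sketch of a live backup that covers those two things: archive the data and record the package list so the installed programs can be reinstalled later (the /backups path is a placeholder).
Code:
sudo tar -czf /backups/home-$(date +%F).tar.gz /home
dpkg --get-selections > /backups/package-list.txt
# later, on the rebuilt system:
#   sudo dpkg --set-selections < package-list.txt && sudo apt-get dselect-upgrade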
View 3 Replies
May 18, 2010
I have had a server running Ubuntu Server 7.10 for a very long time, and I think it's past time I upgraded. I'll be installing fresh, and I've already backed up /var/www (as well as a home directory with a few files). I've only used this as a web / SFTP / file server. Might there be any other directories that would be good to back up? I set it up so long ago and have made a few changes along the way.
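A hedged grab-bag of locations that commonly hold hand-made changes on a box like this; only copy what actually exists and applies.
Code:
# /etc (service configs), crontabs, local mail, and any databases (MySQL path shown; skip what you don't run)
sudo tar -czf /backups/old-server-config.tar.gz /etc /var/spool/cron /var/mail /var/lib/mysql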
View 1 Replies
Jul 9, 2011
I want to get all the directories from a remote server using FTP. I know how to use mget for files; I would like to know if there is a similar way to get a whole directory, with the files included, obviously.
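Two hedged alternatives, since plain ftp's mget does not recurse (host, credentials and paths are placeholders):
Code:
wget -r "ftp://myuser:mypass@ftp.example.com/pub/somedir/"
# or, with lftp installed:
lftp -u myuser,mypass -e 'mirror /pub/somedir /local/dest; quit' ftp.example.com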
View 1 Replies
Feb 18, 2011
I am setting up an SVN server (svn+ssh) to be used by students at the university where I work. Initially I considered a single repository, creating a directory for each project inside it, but it now seems to me that this is not a very secure way of doing things: the repository directory on the server would have permissions 770, which means any student could log on to the server and wipe out the whole repository.
Also, mistakenly or not, any student could 'svn delete' the entire tree, which could be a nightmare to recover from. One option might be to create groups, assign users to groups, create many repositories and assign each repository to a group, but that means managing tens or hundreds of repositories, which is maybe not a very common task. What is an optimal solution for this working environment?
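A hedged sketch of the per-group repository option (names are made up); with svn+ssh, access control ultimately comes down to Unix permissions on the repository directory.
Code:
sudo svnadmin create /srv/svn/proj101
sudo groupadd proj101
sudo chgrp -R proj101 /srv/svn/proj101
sudo chmod -R 2770 /srv/svn/proj101   # only proj101 members can touch it; setgid keeps new files in the group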
View 5 Replies
Sep 14, 2010
Can file and directory names contain Greek characters if I selected English when I installed Linux?
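A quick hedged check: Linux filenames are just byte strings, so Greek works fine as long as the locale's character set is UTF-8 (which an English install normally uses).
Code:
locale                    # look for a UTF-8 charmap, e.g. en_US.UTF-8
touch "δοκιμή.txt" && ls  # creates and lists a file with a Greek name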
View 3 Replies
Jun 27, 2011
To protect web page directories with passwords I have done the configuration below, but the problem is that when I click the link it does not ask for a username and password.
Created new account for logging into web interface:
htpasswd -c /etc/httpd/conf/.htpasswd travelkarega
Created a file named .htaccess in /opt/apps/deploy/websites/travelkarega/html/
vi .htaccess
AuthUserFile /etc/httpd/conf/.htpasswd
AuthName "Please enter password"
AuthType Basic
<Limit GET POST>
require user travelkarega
</Limit>
Added the above entries to the .htaccess file.
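A hedged guess at the usual cause: Apache ignores .htaccess files unless AllowOverride permits them for that directory. A sketch, assuming a CentOS-style httpd that includes conf.d and that no later directive overrides it:
Code:
sudo tee /etc/httpd/conf.d/travelkarega-auth.conf > /dev/null <<'EOF'
<Directory "/opt/apps/deploy/websites/travelkarega/html">
    AllowOverride AuthConfig
</Directory>
EOF
sudo apachectl graceful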
View 1 Replies
Mar 5, 2010
For some reason, my user directories don't seem to process PHP files. For example, server/~reduxtion/index.php forces the browser to download the file, while server/index.php works fine.
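A hedged pointer, assuming a Debian/Ubuntu-style Apache (an assumption): PHP is deliberately switched off for ~user directories in the PHP module configuration. Look for a block like the one below in /etc/apache2/mods-enabled/php5.conf, comment it out, then reload.
Code:
#   <IfModule mod_userdir.c>
#       <Directory /home/*/public_html>
#           php_admin_value engine Off
#       </Directory>
#   </IfModule>
sudo service apache2 reload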
View 1 Replies