I'm quite lost and not even 100% sure which forum I should aim at. I have a relatively simple task, yet for some reason it's proving difficult! Our server runs Fedora (hosting a live commerce site). Recently some security updates were applied, so I can no longer connect remotely via SSH as root. Using PuTTY I now connect under a different username and then su to root. All is fine; now all I want to do is download a file in the /tmp2 directory: /tmp2/apache2-gdb-dump. Can anyone tell me: 1) How would I download this file using PuTTY/SSH commands? 2) I'd much rather use a GUI tool for this kind of work, but the substitute-user step doesn't seem to be supported by common apps such as FileZilla. With this in mind, is there some software, or some steps I can take, so that I can connect, run an su command, and use a nice GUI to transfer these files?
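One common workaround: after su-ing to root, copy the file somewhere your ordinary SSH user can read it, then pull it from Windows with `pscp` (which ships with PuTTY). A minimal sketch; the user `bob` and all paths are placeholders, not taken from the poster's system:

```shell
#!/bin/sh
# Run on the server after `su -`: stage a root-only file so the ordinary
# (non-root) SSH user can fetch it over SCP/SFTP.
stage_for_sftp() {
    src=$1; destdir=$2
    cp "$src" "$destdir/"
    chmod 644 "$destdir/$(basename "$src")"   # readable by the non-root user
}
# Typical use (as root):
#   stage_for_sftp /tmp2/apache2-gdb-dump /home/bob
# Then from the Windows machine:
#   pscp bob@yourserver:/home/bob/apache2-gdb-dump .
```

For the GUI half: WinSCP (using the SCP protocol) can be configured to run a custom shell such as `sudo -s` on the remote side, which may let you skip the staging step entirely; whether that works depends on how sudo is set up on the server.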
I'm having trouble understanding how to verify the download of the Fedora ISO files. I know how to do this on a Windows system. I have been looking in the help section for checking the ISO files, but I'm not sure where to find the right hashes (MD5, SHA1, etc.).
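Fedora publishes a `*-CHECKSUM` file next to each ISO on the mirrors (recent releases use SHA-256), and the check itself is the same idea as on Windows, just with the coreutils tools. A self-contained sketch using a stand-in file (the filenames here are demos, not real Fedora names):

```shell
#!/bin/sh
# Local demo of the check you would run against the real ISO:
#   sha256sum -c Fedora-XX-CHECKSUM
set -e
echo "pretend ISO contents" > demo.iso     # stand-in for the downloaded ISO
sha256sum demo.iso > demo-CHECKSUM         # on the mirror, this file is provided
sha256sum -c demo-CHECKSUM                 # prints "demo.iso: OK" on success
```

If the mirror provides MD5 or SHA1 sums instead, the same pattern works with `md5sum -c` or `sha1sum -c`.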
When I try to view PHP files on my Linux box, they download instead of displaying. I configured Apache for PHP as the manual said, but for some reason it doesn't parse the PHP. I read that httpd.conf might need changing, specifically that the line "AddModule mod_php4.c" was missing from the conf; however, the AddModule and ClearModuleList directives no longer exist in newer versions of Apache. Those directives were used to ensure that modules were enabled in the correct order. The Apache 2.0 API allows modules to specify their ordering explicitly, eliminating the need for them.
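On Apache 2.x the AddModule line is indeed gone; what usually matters is that the PHP module is loaded and that `.php` is mapped to the PHP handler. A sketch of the relevant httpd.conf lines, assuming PHP 4 as in the post (the module path is a typical default and varies by distro):

```apache
# httpd.conf (Apache 2.x + mod_php4); the module path is a common default,
# not necessarily yours
LoadModule php4_module modules/libphp4.so
AddType application/x-httpd-php .php
```

After editing, restart Apache (`apachectl restart`). If PHP is still not parsed, check that your distro's PHP package actually installed the module file at the path named in LoadModule.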
I have tried three times to download DVD-7 from http://cdimage.debian.org/debian-cd/...md64/jigdo-dvd, and every time it has failed with just five files left to download.
I cannot begin to describe the frustration: all those hours of downloading for nothing! What on earth is happening here? When I try to simply continue, I get an "error code 3" abort and have to start all over.
I'd like to configure the vsftpd server to allow remote (local) users to see and edit configuration files in their FTP directory whose names start with a dot (like .htaccess, for example). With the default configuration plus "local_enable=YES" it does not appear to be possible: the user can successfully upload a dot-file but can neither see it in a directory listing nor download it.
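vsftpd hides dot-files from directory listings by default; `force_dot_files` is the option that controls this. A sketch of the relevant vsftpd.conf lines (assuming local users should also be able to write):

```ini
# /etc/vsftpd/vsftpd.conf
local_enable=YES
write_enable=YES
force_dot_files=YES   # show .htaccess and friends even if the client omits -a
```

Restart vsftpd after changing the file. Note the option for local logins is `local_enable`; "local_allowed" is not a vsftpd option.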
I was happily running F10 and, against my better judgement, when it offered to upgrade me to F11 I decided to give it a try. The F11 install hung at 893 of 1626 packages; some SELinux package was mid-install. On the next boot the installer fails to upgrade, reporting a corrupted root. It says I can backtrack and do a full install, but it crashes with a bug when I choose backtrack.
So here is my question: can I edit the command line and tell the installer to do a full install rather than an upgrade? Or am I stuck with downloading a DVD ISO and doing a full install that way? I've done F8, F9 and F10, so a full install from DVD doesn't bother me.
I have a low-powered headless box (a D-Link DNS-323) running Linux that I keep in a back room at home to handle large file downloads. It does not have X installed; everything is done at the command prompt over an SSH link. In the most common case, I log in from a remote location (perhaps a coffee shop), start the download, then disconnect and go about my business.
If I try to download from a free account on RapidShare, FileSonic, or some other file service, there is a manual handshake (decode a captcha, wait 60 seconds, etc.) that requires a graphical client to complete.
I would like to somehow get through the handshake using my laptop (running Firefox), perhaps through a proxy on the DNS-323, but direct the download straight to the DNS-323 (i.e., not routed through the laptop), so that I can disconnect and expect the download to complete without further involvement from me or my laptop.
Is there any way to direct a download to a remote path? Any other suggestions for solving the problem?
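For the "survives logout" half of the problem, a download started under `nohup` (or inside `screen`/`tmux`, if one of those is installed on the DNS-323) keeps running after the SSH session ends. A minimal sketch; the URL would be whatever final link Firefox ends up with after the captcha, and this only works if that link is not tied to your laptop's IP or session cookies (hence the proxy idea):

```shell
#!/bin/sh
# detach_download: start a long-running fetch that survives the SSH session.
# Pass the whole command, e.g.:
#   detach_download wget -c 'http://rapidshare.example/yourfile.zip'
detach_download() {
    nohup "$@" > "$HOME/download.log" 2>&1 &
    echo "started pid $!"
}
```

On the captcha side, people typically copy the post-captcha direct link out of Firefox and paste it into the wget command on the box; tools like plowshare automate the handshake for some hosters, but whether they work against a given service is hit-and-miss.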
This is our first time choosing and installing Linux. Our other servers are all Windows 2008 x64. We were told to install Fedora 13. I can only find a download for the desktop version, and we're looking for the server x64 download. Could I please get a link?
I have a laptop that isn't up to playing 1080p video. I do, however, have a PC that can. I've connected my laptop to my TV as a media center. I can remotely access files from the PC to play, but because the laptop's own resources are used to decode them, I still encounter some slowdown. Is there any way to remote-access the other PC while using its resources to play the video?
I have a computer set up running an Apache httpd server with basic LAMP functionality. I logged into the server today and noticed numerous attempts by a remote IP to access various files on my server.
Code:
[Fri Nov 26 07:37:56 2010] [error] [client 220.127.116.11] File does not exist: /var/www/html/phpMyAdmin-2.8.1-rc1
[Fri Nov 26 07:37:56 2010] [error] [client 18.104.22.168] File does not exist: /var/www/html/phpMyAdmin-2.8.1
[Fri Nov 26 07:37:57 2010] [error] [client 22.214.171.124] File does not exist: /var/www/html/phpMyAdmin-2.8.2
[Fri Nov 26 07:37:57 2010] [error] [client 126.96.36.199] File does not exist: /var/www/html/sqlmanager
There are many, many more lines like this, but it's obvious that a remote host was trying to find my phpMyAdmin page and gain access. My question: since I'm a novice at running httpd, what sort of things should I be looking out for in terms of security and configuration to prevent attacks, or is this common and safe to ignore? I do run phpMyAdmin on the server, and I have configured MySQL to have no blank passwords or anonymous accounts. The database shows no new changes, so I doubt they actually accessed it.
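Probes like these are automated scanners sweeping the whole internet for known phpMyAdmin paths; they are background noise, but since you do run phpMyAdmin it is worth locking down. One common measure is restricting the directory to the addresses you administer from. A sketch in Apache 2.2 syntax (the path and network below are hypothetical; adjust to your install):

```apache
<Directory "/var/www/html/phpMyAdmin">
    Order deny,allow
    Deny from all
    Allow from 192.168.1.0/24   # replace with your own address(es)
</Directory>
```

Renaming the phpMyAdmin alias to something non-obvious, keeping it updated, and running fail2ban against the access log are other common mitigations.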
In Dolphin I have set up an SSH connection to my home server. But when I edit a file, the save to the remote location happens only when the editor is closed. For web development and similar work this is not practical. What alternatives are possible? (Under Windows I love WinSCP.)
I would like to copy several files from a remote machine. These archives are contained in different folders, and their names share a common characteristic (as do the folders). I have tried something like this:
ftp
open machine
prompt          % to get into non-interactive mode
I was browsing one of my friends' hard drives using a Knoppix live CD, and I was amazed to find that all the folders he uses were empty; there were no files present in them. For example, there was a folder in /usr/local named web, but when I browsed that folder from Knoppix it was empty. I searched for the files in every partition but still found nothing.
After some time, when I put the hard disk back and booted the PC normally, everything was in its proper place. Then I thought to make an image of the hard drive and use it on my PC; the image booted well, but those particular files were still missing. I want to know: how is that possible? Is there some way the files are fetched from a remote system during boot-up?
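A common explanation, offered here as a guess: /usr/local/web may be a separate mount, either another partition or a network share (NFS/SMB) mounted at boot by the installed system. A live CD, or a raw image of just one disk, then sees only the empty mount-point directory underneath. Two quick checks to run on the normally booted system (the path is the poster's example):

```shell
#!/bin/sh
# Which filesystem does the folder actually live on?
df -P /usr/local 2>/dev/null || df -P /
# Any separate partitions or network mounts configured at boot?
grep -v '^#' /etc/fstab 2>/dev/null
mount 2>/dev/null | grep -i 'nfs\|cifs' || echo "no network mounts found"
```

If `df` reports a different filesystem for /usr/local than for /, or fstab lists an entry for it, that would account for the files being invisible from Knoppix.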
I have limited access to several servers (key-based auth), but the cron facility is not available to me. Those servers keep filling up with large Apache logs, and I have to log in to each node manually and clean them every single day.
I tried to write a script to run from the login box, but when I run it, it looks for the logs on the local server (the login box) instead.
So the current situation is:
How can I modify this so that the script makes server1 look for the files on that server and zip them?
Google also showed me a command called rsh, but in my environment it is not available either.
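Assuming the problem is quoting (an unquoted command is expanded by the login box's shell before ssh ever runs it), the fix is to pass the whole command as a single quoted string so it executes on the remote side. A sketch, with the compression logic kept in a locally testable function (`server1` and the log path are placeholders):

```shell
#!/bin/sh
# Gzip *.log files older than one day under a given directory.
clean_logs() {
    find "$1" -name '*.log' -mtime +1 -exec gzip -f {} \;
}
# Remote invocation: single-quote the whole command so find, gzip, and the
# glob all run on server1, not on the login box:
#   ssh server1 'find /var/log/httpd -name "*.log" -mtime +1 -exec gzip -f {} \;'
# And to cover all the nodes in one pass:
#   for h in server1 server2 server3; do
#       ssh "$h" 'find /var/log/httpd -name "*.log" -mtime +1 -exec gzip -f {} \;'
#   done
```

Since cron is unavailable on the servers but presumably usable on the login box (or any machine you control), the loop above can be scheduled there instead, pushing the work out over SSH.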
I'm trying to download a set of files with wget, and I only want the files and paths "downwards" from a URL, that is, no other files or paths. Here is the command I have been using:
Code: wget -r -np --directory-prefix=Publisher http://xuups.googlecode.com/svn/trunk/modules/publisher
There is a local path called 'Publisher'. The wget run works okay and downloads all the files I need into the Publisher path, but then it starts fetching files from other paths. Given the URL .../svn/trunk/modules/publisher, I only want those files, plus the paths and files beneath that URL.
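The flags that keep a recursive fetch inside one subtree are `-np` (never ascend to the parent), plus `-nH`/`--cut-dirs` if you also want the saved paths to start below that point. A self-contained demo against a throwaway local server; the directory layout mimics the googlecode URL, and the real invocation would simply swap in that URL:

```shell
#!/bin/sh
# -np   : don't ascend above the start URL
# -nH   : don't create a hostname directory
# --cut-dirs=3 : trim svn/trunk/modules from the saved paths
# Real-site form:
#   wget -r -np -nH --cut-dirs=3 -P Publisher http://xuups.googlecode.com/svn/trunk/modules/publisher/
set -e
mkdir -p site/svn/trunk/modules/publisher
echo 'hi' > site/svn/trunk/modules/publisher/index.html
python3 -m http.server 8731 --directory site >/dev/null 2>&1 &
SRV=$!
sleep 1
wget -q -r -np -nH --cut-dirs=3 -P Publisher http://127.0.0.1:8731/svn/trunk/modules/publisher/
kill "$SRV"
```

Note the trailing slash on the start URL: without it wget treats `publisher` as a file, and `-np` then anchors one directory level higher than you want.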
When we download Ubuntu ISO images from the download mirrors, MD5SUMS, SHA1SUMS and SHA256SUMS files are provided alongside the downloads. But there are also some .gpg files. Are they used to verify a signature of some sort? How do I use them to verify the signatures of the images?
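Yes: the .gpg files are detached OpenPGP signatures over the checksum files. You verify the signature on SHA256SUMS with the distributor's public key, then verify your ISO against the now-trusted checksums with `sha256sum -c`. Below is a self-contained demo of the mechanism using a throwaway key; against real Ubuntu files you would skip the key generation and instead fetch Ubuntu's signing key (`gpg --recv-keys` with the key ID that a failed `gpg --verify` reports as missing):

```shell
#!/bin/sh
# Real usage, once the signing key is in your keyring:
#   gpg --verify SHA256SUMS.gpg SHA256SUMS   # is the checksum file authentic?
#   sha256sum -c SHA256SUMS                  # does the ISO match the checksums?
# Demo of the same detached-signature mechanism with a throwaway key:
set -e
export GNUPGHOME=$(mktemp -d)
chmod 700 "$GNUPGHOME"
gpg --batch --quiet --gen-key <<'EOF'
%no-protection
Key-Type: default
Name-Real: Demo Signer
Name-Email: demo@example.invalid
Expire-Date: 0
%commit
EOF
echo 'pretend-checksums  ubuntu.iso' > SHA256SUMS
gpg --batch --yes -u demo@example.invalid --output SHA256SUMS.gpg --detach-sign SHA256SUMS
gpg --verify SHA256SUMS.gpg SHA256SUMS && echo "signature OK"
```

The point of the signature is that the checksum files alone only prove the download wasn't corrupted; the .gpg signature proves the checksum file itself really came from the distributor.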