Fedora :: Keep On Downloading Tar.gz Files Into Downloads Folder
Sep 20, 2009
I keep downloading tar.gz files into my Downloads folder and I can't do anything with them. What do I need to do to install the file so I can use it? For example, I am trying to install Frets on Fire, and I am failing badly.
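The usual routine for a source tarball is: unpack it, then follow its README or INSTALL file. A sketch of the pattern; the archive below is a stand-in that the demo creates for itself, so with a real download you would skip the first block:

```shell
# Create a stand-in tarball (with a real download, skip these three lines)
mkdir -p demo-src
printf 'echo hello\n' > demo-src/run.sh
tar czf frets-demo.tar.gz demo-src
rm -r demo-src                      # pretend we only have the tarball

tar xzf frets-demo.tar.gz           # unpack into ./demo-src
ls demo-src                         # look for README, INSTALL, configure, or a launcher
```

For C/C++ sources the next step is typically `./configure && make && sudo make install`; a game like Frets on Fire usually ships a launcher script instead, so check its README first.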
There is some issue with the latest version of LinuxDC++ (version: 1.2.0~pre1~bzr, core version: 0.75). I upgraded to the latest version a few days back from the Launchpad repos because of frequent crashes and system hang-ups. My complete and incomplete downloads folders are the same. While my download was in progress, LinuxDC++ deleted all the files from the downloads folder.
After about half an hour of tweaking, I updated my database using updatedb and found my files in ~/.dc++/Filelists/anantwqqwe.BMIU2NFCFXB7ERTSG62PRSQPRJIN63A56EEGO6Q . What is that supposed to mean, and why is it doing things this way? This is the second time this has happened to me; the last time I was unable to locate the deleted files on my system, and I suppose they were not there. I use 64-bit Ubuntu 10.10 (Maverick).
All of my folder icons changed to the Downloads icon, and even new folders get the wrong default. Does anyone know how to fix this? The Places icons I had to change manually.
I understand wget is used to download files. Is there a way I can search a URL to see what files are available for me to download? I need to install a plug-in from an Adobe website.
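wget cannot search a server, but if the site exposes a directory index you can list the links it contains and pick your files from that. A sketch using curl and grep; the local index.html is a stand-in for the real Adobe page:

```shell
# Stand-in for the remote index page (point the URL at the real site instead)
cat > index.html <<'EOF'
<a href="plugin-1.0.tar.gz">plugin</a>
<a href="readme.txt">readme</a>
EOF

# Pull out every href target -- these are the downloadable files
curl -s "file://$PWD/index.html" | grep -o 'href="[^"]*"' | cut -d'"' -f2 > files.txt
cat files.txt
```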
If I download a .deb file and run dpkg, all the files install into the correct directories. But with a tar.bz2 file, running tar xvjf unpacks it into the current directory. How can I make a tar.bz2 file install into the appropriate directory?
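A .deb records its install paths inside the package; a plain tarball has no such notion and simply unpacks where you point it. tar's -C flag chooses that destination. A sketch with a stand-in archive:

```shell
# Build a stand-in archive (with a real download, skip these three lines)
mkdir -p src && echo data > src/file.txt
tar cjf example.tar.bz2 src
rm -r src

mkdir -p dest
tar xvjf example.tar.bz2 -C dest   # unpack under ./dest instead of the current dir
ls dest/src
```

For software that must be built, there is no fixed destination in the archive at all; the install prefix is usually chosen at build time (e.g. `./configure --prefix=/usr/local`).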
I have been unable to open my Downloads folder for two days now. I've tried everything. I can get into every other folder on my machine, though. I also have 103 GB of free space on my machine. Is there perhaps a limit on the amount of data I can have per folder? I'm running 64-bit Ubuntu.
I am Vijaya; glad to meet you all via this forum. My question: I set up a crontab entry to download files from the internet automatically using wget, but several copies of the process end up running in the background. I want only one copy of each file, not many, and I have not been able to find out where it is actually downloading to.
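The usual cause is cron firing a new wget before the previous run has finished. A sketch, assuming util-linux's flock(1) is available; the URL and paths in the crontab line are placeholders:

```shell
# Crontab sketch: flock -n lets only one copy run (the next one gives up),
# -N skips the download when the server copy is unchanged, and -P pins the
# target directory so you always know where the file lands.
#   */10 * * * * flock -n /tmp/fetch.lock wget -N -P /home/vijaya/downloads http://example.com/file.dat

# Local demonstration of the locking behaviour:
flock /tmp/demo.lock -c 'sleep 2' &   # first holder keeps the lock briefly
sleep 1
if flock -n /tmp/demo.lock -c true; then
    echo "lock was free"
else
    echo "lock was held"               # this branch runs while the holder sleeps
fi
wait
```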
I am used to Ubuntu's simple sharing with Samba: just install it, reboot, and share the files. Then I click on the Network folder and see all the shared files on the computers in the network.
How do I install it so I only need to go into the Network folder to see the other computers' shared files? And then, how do I share files?
I hope it's not so difficult that I have to change config files.
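For what it's worth, the server side on most distributions really is one stanza in Samba's configuration (commonly /etc/samba/smb.conf). A minimal sketch, assuming the samba package is installed; the share name and path are placeholders, and after editing you restart the smbd/nmbd services. The share should then appear under the Network folder:

```ini
; /etc/samba/smb.conf -- minimal sketch; names and paths are placeholders
[global]
   workgroup = WORKGROUP

[shared]
   path = /home/youruser/shared
   read only = no
   guest ok = yes
```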
I have Debian and want to be able to connect to an FTP server, download some (or all) files, disconnect from that server, connect to another FTP server, and upload everything to it (and then delete the temporary files on my PC). This should be done from the command line. I am no expert in Linux (although I am accustomed to it). How can I do this (or part of it)? In the end I would like to write a script that mirrors my site from one place to another.
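One tool that fits this exactly is lftp, whose `mirror` command copies a remote tree down and `mirror -R` pushes a local tree up. A sketch under the assumption that lftp is installed (`apt-get install lftp`); hosts, credentials, and paths are placeholders:

```shell
#!/bin/sh
set -e
TMP=$(mktemp -d)   # temporary local copy, removed at the end

fetch_site() {     # download everything from the source server
    lftp -u user1,pass1 ftp.source.example -e "mirror /public_html $TMP; quit"
}

push_site() {      # upload the local copy to the destination server
    lftp -u user2,pass2 ftp.dest.example -e "mirror -R $TMP /public_html; quit"
}

# fetch_site && push_site && rm -rf "$TMP"   # uncomment to run for real
```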
I am using Ubuntu 8.04. When I install software from the internet, the files are downloaded to /var/cache/apt/archives, so I keep those Debian packages in a safe place so that I can install the software on another stand-alone computer.
My question: when we run 'apt-get update' for the first time after a fresh install, it downloads some files. Can I store those files and point a networkless computer at them? If this is possible, it would let me run 'apt-get check' on the stand-alone computer to see whether any package is broken.
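This is possible: apt-get keeps the downloaded .deb files in /var/cache/apt/archives and the index files fetched by 'apt-get update' in /var/lib/apt/lists, and both can be carried over on removable media. A sketch (run as root; the /media/usb mount point is a placeholder):

```shell
save_apt_state() {      # on the networked machine
    mkdir -p /media/usb/debs /media/usb/lists
    cp /var/cache/apt/archives/*.deb /media/usb/debs/
    cp /var/lib/apt/lists/*          /media/usb/lists/
}

restore_apt_state() {   # as root on the offline machine
    cp /media/usb/lists/* /var/lib/apt/lists/   # gives apt its package indexes
    dpkg -i /media/usb/debs/*.deb               # installs the cached packages
}

# save_apt_state / restore_apt_state -- uncomment on the respective machines
```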
When I right-click on a file in my Firefox Downloads window and choose "Open Containing Folder", it opens the folder in EasyTag. I tried both of the "solutions" below and logged out and back in to openSUSE, and it still uses EasyTag! Is it because I need a restart, or is it something else?
Neither one has worked yet... [URL] Open Containing Folder in Firefox under Linux
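One thing worth checking: Firefox asks the desktop for the default handler of the inode/directory MIME type, and EasyTag may have registered itself there. A sketch of resetting it with xdg-utils; the .desktop file name is an assumption for a GNOME setup (it would be dolphin.desktop on KDE):

```shell
fix_folder_handler() {
    xdg-mime default nautilus.desktop inode/directory   # reclaim folders for the file manager
    xdg-mime query default inode/directory              # verify the change took
}
# fix_folder_handler   # uncomment to apply, then restart Firefox
```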
So I recently installed Lubuntu on my netbook and have it all working except for one hitch: when I download something in Chromium (such as a .deb or .odt file) and click the finished download in the bar at the bottom, it does not open the file in gdebi/AbiWord as it should. Also, when I right-click a file and select "show in folder", nothing happens...
Not sure if this is an issue with Chromium or LXDE/PCManFM, but I figured it couldn't hurt to post here and ask about it.
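For what it's worth, both symptoms go through xdg-open, which resolves handlers from the shared MIME database; if nothing (or the wrong thing) is registered for a type, the click silently does nothing. A sketch of re-pointing the relevant types; the .desktop names and the .deb MIME type are assumptions for a stock Lubuntu install of that era:

```shell
fix_lxde_handlers() {
    xdg-mime default pcmanfm.desktop inode/directory              # fixes "show in folder"
    xdg-mime default gdebi.desktop application/x-debian-package   # fixes .deb opening
}
# fix_lxde_handlers   # uncomment to apply, then log out and back in
```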
Losing my car keys is something I have learned to live with, but now my Downloads folder has vanished. I'm sure I left it in the usual place, but it's nowhere to be found.
How do I reinstate it? I assume it's not as easy as just creating a new folder in my Home folder called "Downloads", is it?
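It very nearly is that easy: the special folders are just directories registered in ~/.config/user-dirs.dirs. A sketch; the first line switches to a scratch HOME purely so the demo can run anywhere, and would be dropped on a real system (where you would also run xdg-user-dirs-update afterwards so the desktop picks the entry up):

```shell
export HOME=$(mktemp -d)              # scratch HOME for the demo only -- drop this line
mkdir -p "$HOME/Downloads"            # recreate the folder itself
mkdir -p "$HOME/.config"
# Register it so applications find it again ($HOME stays literal in the file)
echo 'XDG_DOWNLOAD_DIR="$HOME/Downloads"' >> "$HOME/.config/user-dirs.dirs"
cat "$HOME/.config/user-dirs.dirs"
```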
Basically, this command goes to a URL and downloads file1.txt and file2.txt, but it saves BOTH files as newfilename1.txt. I would like the script to save the second download (file2.txt) as newfilename2.txt. Before you tell me to use the -O switch in curl, please understand that I want to rename the files so that they are not what they were on the server (the names are too long). So file1.txt becomes newfilename1.txt and file2.txt becomes newfilename2.txt. Is this possible? The command I listed works only up to newfilename{1,2}.txt; it always saves as newfilename1.txt.
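For what it's worth, curl pairs each -o with the URL it accompanies, so one invocation can give every download its own name. A sketch using local file:// URLs as stand-ins for the server:

```shell
# Stand-ins for the remote files (replace the file:// URLs with the real ones)
echo one > file1.txt
echo two > file2.txt

curl -s -o newfilename1.txt "file://$PWD/file1.txt" \
        -o newfilename2.txt "file://$PWD/file2.txt"

cat newfilename1.txt newfilename2.txt
```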
In order to download files from a particular website, I have to include a header containing the text of a cookie, to indicate who I am and that I am properly logged in. So the wget command ends up looking something like:
Code:
wget --header "Cookie: user=stringofgibbrish" http://url.domain.com/content/porn.zip
Now, this does work in the sense that the command downloads a file of the right size with the expected name. But the file does not contain what it should: the .zip files cannot be unzipped, the movies cannot be played, etc. Do I need some additional option, like the "binary" mode in the old FTP protocol? I tried installing gwget; it is easier to use, but it has no way to include the --header stuff, so the downloads never happen in the first place.
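One quick diagnostic worth running: if the cookie was not accepted, the server may have sent an HTML error or login page under the expected file name (HTTP has no binary/text mode to get wrong). The file and head commands reveal this; the suspect.zip below is a stand-in created by the demo:

```shell
# Stand-in for a bad download (inspect your real file instead)
printf '<html>login required</html>' > suspect.zip

file suspect.zip          # reports what the content actually is
head -c 200 suspect.zip   # peek at the first bytes -- HTML here means a rejected cookie
```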
I'm having a weird problem with GNOME. Inside a folder I have several files, and when I open the folder graphically I can't see some of them. If I use my terminal, I can see those files. It's very strange! I've rebooted my computer and the problem is still there (in the same folder).
I am having problems seeing file and folder names in Nautilus, but they are there, as I can access them with CLI commands. Is there a way to get Nautilus to update its database, or whatever it uses? I am using Nautilus 2.32.2.1. As shown in my signature, I am using F14 and GNOME 2.32.0.
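For what it's worth, Nautilus keeps no separate database of names; it reads directories directly, so a stale view usually clears after forcing it to restart (and if the invisible names happen to start with a dot, Ctrl+H toggles hidden files). A sketch:

```shell
refresh_nautilus() {
    nautilus -q    # ask all running Nautilus instances to quit
    nautilus &     # it restarts and re-reads the folders from disk
}
# refresh_nautilus   # uncomment to run in a desktop session
```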
I'm having a very strange problem with my Ubuntu Apache2 server running WordPress. I want to download media files (from within a Flash MP3 player on the site, or by link [url]), but the file transfer just stops after a while (at least sometimes), at random positions. After that I have to clear the browser's cache and try again.
It is really annoying, since it is my band's website and we want to share our songs with our friends. I checked from several clients; it seems to happen everywhere (Linux, Mac, and Windows clients).
In the Linux bash shell, for a given directory, how can I list: the creation date of that directory, the number of files in it, and the number of subdirectories in it?
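A sketch of the three queries; note that classic ext3 does not record a creation time, so stat's modification/change times are the closest available. The demo builds a sample directory so the commands have something to count:

```shell
# Sample directory: 3 files, 2 subdirectories (use your own path instead)
dir=statdemo
mkdir -p "$dir/sub1" "$dir/sub2"
touch "$dir/a.txt" "$dir/b.txt" "$dir/c.txt"

stat -c '%y' "$dir"                                   # last modification time (no true create date on ext3)
find "$dir" -maxdepth 1 -type f | wc -l               # number of files
find "$dir" -mindepth 1 -maxdepth 1 -type d | wc -l   # number of subdirectories
```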
When I used Windows there was a wonderful editor named Notepad++. It was perfect (it still is). Some of its best and most useful features (for me) were:
1- open all the files in a folder when you drag and drop the folder onto it
2- search and replace a string across all open files
3- an extended mode that includes special characters
and so on. I want to know if there is an editor with these features in Ubuntu.
Yahoo! is shutting down GeoCities and I need to download all the files in my web folder there. Is there a program that will download all the files automatically?
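wget can do this with its recursive mirror mode. A sketch, with your GeoCities URL as a placeholder:

```shell
mirror_site() {
    # --mirror          recurse and keep timestamps
    # --no-parent       stay inside your own folder
    # --page-requisites also grab images/CSS the pages use
    # --convert-links   rewrite links so the local copy browses offline
    wget --mirror --no-parent --page-requisites --convert-links "$1"
}
# mirror_site http://www.geocities.com/yourname/   # uncomment with your real URL
```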
I've been using RedHat/Fedora for years now, and every now and then I encounter the following situation:
I open a folder and it's empty. The folder contained files, and I'm 100% sure I didn't delete them myself. Each time, the folder is deep inside the hierarchy and is among other untouched folders. Sometimes it's a folder I never use; sometimes it's one I use almost every day. The missing content is not large (a few regular files).
I'm currently running F13, but I've seen this behavior before on previous versions. This is kind of scary: all my work is there, and my backups are also done on a Linux backup server.
I'm puzzled. I cannot see anything specific about these folders, and I had no crash or cold reboot; nothing I can see explains it. Could it be related to ext3?
I had a Fedora 7 and Windows XP dual-boot system. A few weeks ago there was a GRUB error. Now I want to install Fedora 14, but before that I want to back up all my data to an external hard drive. Using a Fedora 13 live CD I could access all the drives formatted as NTFS, but I couldn't access /home on the drive formatted as ext2 and owned by my user.
Please let me know if there is any way to copy the files that remain in my home folder. (I can see them, but copying is prevented.)
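From the live CD, root can read files regardless of which user ID owns them, so routing the copy through sudo usually gets around the permission block. A sketch; the device names and mount points are placeholders (blkid shows the real ones):

```shell
rescue_home() {
    sudo mkdir -p /mnt/oldhome /mnt/backup
    sudo mount /dev/sda3 /mnt/oldhome              # the ext2 /home partition
    sudo mount /dev/sdb1 /mnt/backup               # the external drive
    sudo mkdir -p /mnt/backup/home-backup
    sudo cp -a /mnt/oldhome/. /mnt/backup/home-backup/   # -a keeps owners and times
}
# rescue_home   # uncomment on the live system, with the real device names
```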