General :: Large Directory With Wget With Two Links Pointing At Same Thing
Mar 19, 2011
I'm trying to crawl a directory on a website and basically download everything in it. The structure is simple enough (though there are also multiple folders), but there is one thing that makes wget choke up: two of the links point at the same thing. Both links work, but they are the same resource, so wget downloads the same file twice. How can I make wget ignore the first one? What I have tried so far doesn't seem to actually do anything; it still downloads the duplicate URLs.
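One approach, assuming the unwanted duplicate URL follows a recognizable pattern (the patterns below are purely hypothetical), is wget's --reject-regex option in newer wget releases, or -X to exclude a whole directory:
Code:
# hypothetical: skip the "index.html" form of the link so only the bare directory URL is fetched
wget -r -np --reject-regex 'index\.html$' http://example.com/dir/
# hypothetical: exclude an entire duplicate directory path instead
wget -r -np -X /dir/duplicate http://example.com/dir/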
View 1 Replies
Jun 29, 2010
I'm trying to download two sites for inclusion on a CD: URL... The problem I'm having is that these are both wikis. So when downloading with, e.g.: wget -r -k -np -nv -R jpg,jpeg,gif,png,tif URL... Does somebody know a way to get around this?
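The post doesn't spell out exactly what goes wrong, but a common snag when mirroring a wiki is that recursion wanders into the dynamically generated edit/history/special pages. A sketch, assuming a MediaWiki-style URL layout (the regex patterns are assumptions):
Code:
# skip MediaWiki-style dynamic pages while mirroring; the patterns are guesses
wget -r -k -np -nv -R jpg,jpeg,gif,png,tif \
     --reject-regex '(action=|oldid=|Special:)' \
     http://example.com/wiki/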
View 2 Replies
View Related
Jun 16, 2010
For some reason, my Slackware 13.0 system has multiple problems.
When I do ldd /usr/bin/X | grep libpixman
it reports a libpixman which is NOT in /usr/lib.
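To track down where the linker is actually finding the library, a couple of standard commands can help (a sketch; nothing here is specific to the poster's box):
Code:
# list every libpixman known to the runtime linker cache
ldconfig -p | grep libpixman
# search the filesystem for stray copies
find / -xdev -name 'libpixman*' 2>/dev/null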
View 1 Replies
View Related
May 5, 2010
The *.dbf files (dBASE III Plus files) have a header (metadata) followed by n fixed-length records. I'd like to make a directory entry (like a symbolic link) that points to the fixed-record area inside a .dbf file. Is this possible in Linux? The request is motivated by accessing a .dbf file from a Firebird SQL database using CREATE TABLE EXTERNAL FILE '/tmp/mydbf.dbf' ( ... ); but this command only works on fixed-length records.
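A symlink cannot point into the middle of a file, but the header can be stripped into a separate file that contains only the fixed-length records. A sketch, assuming the header length is known (the 512 below is a placeholder, not the real dBASE header size):
Code:
# copy everything after the header into a records-only file; the header size is a placeholder
dd if=/tmp/mydbf.dbf of=/tmp/mydbf_records.dat bs=1 skip=512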
View 1 Replies
View Related
Feb 20, 2011
I can see some soft links in /etc directory which are pointing to /etc/rc.d Directory contents.
Code:
lrwxrwxrwx. 1 root root 7 Jan 31 08:19 rc -> rc.d/rc
lrwxrwxrwx. 1 root root 10 Jan 31 08:19 rc0.d -> rc.d/rc0.d
lrwxrwxrwx. 1 root root 10 Jan 31 08:19 rc1.d -> rc.d/rc1.d
code....
Can anybody tell me the purpose of these soft links in the /etc directory? I am using RHEL 5.4.
View 3 Replies
View Related
Apr 25, 2010
I've looked around the other threads as well as the wget man page. I also Googled for some examples, but I still cannot work it out. From the page [URL] I want to download the 48 linked files and their corresponding information pages. To do this by hand for the first file, I click on the line that says "Applications (5)", go to the first option, "Dell - Application", open and copy the linked page ("Applies to: Driver Reset Tool"), then go back to the first page and click on the Download button. In the window that opens up I choose to save the file.
Then I move on to the next option (which is "Sonic Solutions - Applications") and repeat this until I have all my files. I do not want to download the many other links on this page, just the ones mentioned above, so I can take it back to my internet-less place and refer to it as if I were on the net. I am using the 9.10 LiveCD at my friend's place.
View 2 Replies
View Related
Aug 8, 2011
[URL]... The download button's link cannot be opened in a new tab. What can I do?
View 5 Replies
View Related
Jul 2, 2010
I'm trying to download all the data under this directory using wget: [URL]. I would like to achieve this with wget, and from what I've read it should be possible using the --recursive flag. Unfortunately, I've had no luck so far. The only files that get downloaded are robots.txt and index.html (which doesn't actually exist on the server), but wget does not follow any of the links in the directory listing. The command I've been using is: Code: wget -r *ttp://gd2.mlb.***/components/game/mlb/year_2010/
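The fact that robots.txt is fetched suggests the server's robots rules may be forbidding recursion; wget honors robots.txt during recursive downloads. A sketch of how to override that and stay below the starting directory (the URL is left obfuscated as in the post):
Code:
# ignore robots.txt and do not ascend to the parent directory
wget -r -np -e robots=off *ttp://gd2.mlb.***/components/game/mlb/year_2010/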
View 4 Replies
View Related
Sep 21, 2010
I am new to Linux and wget. What would be the syntax to use wget to move content from one local directory into an SVN repository (svn commit)? For instance, if I have a directory c:\dir1 and I want to move its content onto an SVN repo, is this possible using wget? If so, how do I get this done?
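wget only retrieves content over HTTP/FTP, so it plays no part in committing files; adding a local directory to a repository is normally done with the Subversion client itself. A sketch, where the repository URL and message are hypothetical:
Code:
# import a local directory into a Subversion repository (the URL is hypothetical)
svn import /path/to/dir1 http://svn.example.com/repo/trunk/dir1 -m "initial import of dir1"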
View 2 Replies
View Related
Oct 16, 2010
I am having to reinstall 10.10 and am putting it on its own drive. Even though I can't get my system to boot properly, my old home directory is still intact on a different drive. How can I get the new install to point at the old home directory? I have read the tutorials, but it just isn't clicking for me.
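One common approach is to mount the old drive's partition as /home in the new install via /etc/fstab. A sketch; the UUID is a placeholder for the old partition's real UUID (shown by blkid), and the filesystem type is assumed to be ext4:
Code:
# find the UUID of the partition holding the old home directory
sudo blkid
# then add a line like this to /etc/fstab (UUID and filesystem type are placeholders)
UUID=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx  /home  ext4  defaults  0  2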
View 9 Replies
View Related
Jul 6, 2011
What is the Wget command to perform the following:
download only HTML from the URL and save it in a directory
other file extensions like .doc, .xls, etc. should be excluded automatically
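A sketch using wget's accept list, which only keeps files whose names match the given patterns (the site and target directory are placeholders):
Code:
# recursively fetch only .html/.htm files into the "htmlonly" directory
wget -r -np -A "*.html,*.htm" -P htmlonly http://example.com/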
View 4 Replies
View Related
May 4, 2010
Can you please tell me how I can tarball/compress a directory hierarchy containing soft links in Linux?
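By default tar stores symlinks as symlinks; adding -h (--dereference) archives the files they point to instead. A sketch with a placeholder directory name:
Code:
# keep symlinks as symlinks
tar -czvf mydir.tar.gz mydir/
# or follow symlinks and archive their targets instead
tar -czvhf mydir-deref.tar.gz mydir/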
View 2 Replies
View Related
Aug 1, 2010
I am trying to create hard links to a directory within a file system, but I am unable to do so. Is there any limitation on creating hard links to directories within a file system?
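Linux refuses hard links to directories; only regular files can be hard linked. The usual substitutes are a symbolic link or a bind mount. A sketch with placeholder paths:
Code:
# hard-linking a directory is refused (ln reports that hard links to directories are not allowed)
ln /data/projects /data/projects-link
# alternatives: a symbolic link, or a bind mount
ln -s /data/projects /data/projects-link
sudo mkdir -p /mnt/projects-view
sudo mount --bind /data/projects /mnt/projects-view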
View 4 Replies
View Related
Feb 10, 2010
I have a directory containing around 2.8 lakh (280,000) files. I want to move them to another directory. If I use cp or mv I get the error 'argument list too long'. If I write a script like
for file in $(ls); do
    cp "$file" /path/to/destination/
done
then, because of the ls command, its performance degrades. How can I do this?
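The 'argument list too long' error comes from the shell expanding * into hundreds of thousands of arguments, and parsing ls output is fragile. A sketch that lets find feed the files to mv in batches instead (paths are placeholders):
Code:
# move regular files without building one huge argument list
find /path/to/source -maxdepth 1 -type f -exec mv -t /path/to/destination {} +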
View 7 Replies
View Related
Aug 3, 2011
I have folder stucture:
|- dir1/
| |- sub1/
|
|- dir2/
|- sub1link -> /dir1/sub1/
and my current working directory is sub1link, is there a quick way to either: change directory to the link source's parent (i.e. something similar to cd .., but taking me to /dir1/), or change directory to the link source itself (i.e. switch from /dir2/sub1link/ straight to /dir1/sub1/)?
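In bash, cd -P switches to the physical (resolved) path, and readlink -f plus dirname reaches the link target's parent. A sketch:
Code:
# from /dir2/sub1link, jump to the physical directory /dir1/sub1
cd -P .
# jump to the parent of the link target, i.e. /dir1
cd "$(dirname "$(readlink -f .)")"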
View 1 Replies
View Related
May 19, 2010
I am currently trying to copy a directory of roughly 400 GB to DVD and have gotten myself stuck. I tried to tar and then split; however, I don't have enough room on my hard drive to make a compressed tar, split it up, and then burn it to disk, so I need a way to tar and compress the directory, split it, and burn to disk every 4.3 GB.
I went ahead and installed DAR as an alternative, as I hear it is designed for this type of task, but I can't figure out which way is heads or tails.
My OS is the newest version of Ubuntu 10.
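dar can write the archive directly as DVD-sized slices and pause between them, so one huge intermediate archive never has to exist on disk. A sketch based on my reading of dar's options; the paths, slice size, and compression choice are assumptions worth checking against the man page:
Code:
# create compressed ~4300 MB slices, pausing after each one so it can be burned and deleted
dar -c /path/to/output/backup -R /path/to/large_directory -s 4300M -z -p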
View 5 Replies
View Related
Apr 4, 2010
I have a RHEL 5 server. The ABC directory is 57 GB; after taking a backup on the same disk with the name ABC.bkp, it shows 56 GB. I used the command below to copy/backup: # cp -r ABC ABC.bkp (different sizes after copying). I checked both directory sizes with # du -sh ABC and # du -ks ABC.bkp, and in both GB and KB there is a sizeable difference (about 200 MB). Why does this happen when copying? What is the solution, and what is the correct way to copy one directory to a new directory exactly?
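du differences after cp -r are often due to sparse files, hard links, or different block allocation rather than lost data. An archive-mode copy preserves more attributes, and rsync with checksums can confirm the contents really match. A sketch:
Code:
# archive copy: preserves permissions, ownership, timestamps and symlinks
cp -a ABC ABC.bkp
# or copy with rsync, preserving hard links, and verify contents by checksum
rsync -aH --checksum ABC/ ABC.bkp/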
View 4 Replies
View Related
Feb 3, 2011
We have two servers: one is the web server and the other is the MySQL server. When transferring a 2 GB file from the web server to the MySQL server, the web server's connection to the MySQL DB server dies completely, and we need to restart the MySQL process for it to come back online. During this connection downtime, using phpMyAdmin on the MySQL server itself shows no problem running queries, etc.
View 2 Replies
View Related
Jan 22, 2010
I have uShare 1.1a setup to talk to my XBox 360. If I share a directory that has no subdirectories, the video files display on the XBox. However, most of my files are in sub-directories on a different partition - I don't really want to copy them to the share, but uShare doesn't seem to recognise any sub-directories or files contained therein.
I have tried setting up symbolic soft links directly to the video files (although this is a pain, it is better than moving the files)...
Code:
ln -s /home/jonftp/TV-Shows/Buffy/Season-1/Buffy-101.avi /home/share/Buffy-101.avi
...but these don't show up on the XBox either.
How can I get uShare to "drill down" the directory structure to list the files or how can I get uShare to follow symbolic links?
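If uShare will not follow symlinks, a workaround that looks like a real directory to it is a bind mount of the media folder into the share. A sketch reusing the paths from the post:
Code:
# expose the TV-Shows tree inside the shared directory without copying or symlinking
sudo mkdir -p /home/share/TV-Shows
sudo mount --bind /home/jonftp/TV-Shows /home/share/TV-Shows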
View 2 Replies
View Related
Jul 27, 2010
I am trying to download a site using wget: $ sudo wget -r -Nc -mk [URL] but it is downloading the contents of all directories and subdirectories under the domain [URL] (ignoring the 'codejam' directory), so it downloads from links like [URL]... I want to restrict the download so that the wget command downloads only the things under the 'codejam' directory.
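wget can be confined to a directory subtree with --no-parent and --include-directories. A sketch; the domain is a placeholder since the post's URLs are elided:
Code:
# recurse only inside /codejam, never ascending to parent or sibling directories
wget -r -np -I /codejam http://example.com/codejam/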
View 9 Replies
View Related
Oct 3, 2010
Code:
The lines highlighted in red are symbolic links to the boot files. Yet, they are not used and, if deleted, the system still works.
Anyone know why they are there? Is it a leftover from Linux days gone by, or does SOMETHING use them?
View 2 Replies
View Related
Jan 15, 2010
I have a Music folder and I need to create hard links for all files in Music directory.For example:
~/Music/01 - the song.ogg // hard link for this file
~/Music/Folder/02 - a song.ogg //and for this file TOO!
~/Other Music/01 - the song.ogg
~/Other Music/Other foldr/02 - a song.ogg
I want to hard link the files in the folders and subfolders, but not the folders themselves.
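Directories cannot be hard linked anyway, but GNU cp can recreate a whole tree while hard-linking every file instead of copying it. A sketch; the destination name is an assumption:
Code:
# rebuild the Music tree as "Music-links", hard-linking each file rather than copying it
cp -al ~/Music ~/Music-links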
View 6 Replies
View Related
Mar 12, 2010
i'm using wget with this parameters:
wget -E -H -k -K -p -nH -nd -Pfolder http://www.mysite.com
Using these parameters, the main HTML page and all the images are downloaded into the same folder. Instead I would like to have the HTML page in one folder and all the images, CSS, etc. in a subdirectory. For example, I want to have:
c:\folder\main.html
c:\folder\subfolder\image.jpg
c:\folder\subfolder\image2.jpg
c:\folder\subfolder\css.css
It's the same thing Mozilla Firefox does when I save an HTML page to the local machine ("Save Page As" in the File menu). Which parameters do I have to use?
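wget has no switch that reproduces Firefox's page-plus-files layout exactly, but dropping -nd keeps the server's directory structure, so assets that live in subdirectories on the server land in subdirectories locally. A sketch, assuming that layout is acceptable:
Code:
# without -nd, images/CSS keep their remote subdirectories under "folder"
wget -E -H -k -K -p -nH -Pfolder http://www.mysite.com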
View 1 Replies
View Related
Jan 21, 2010
If I wanted to back up a large directory (13 GB) to DVD, what would be the best way to do this? Basically, what is the easiest way to make an archive that is split into volumes small enough to burn to disc?
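One common approach is to pipe tar straight into split, producing DVD-sized pieces without first creating a single large archive. A sketch; the directory name and piece size are placeholders:
Code:
# create ~4.3 GB pieces named backup.tar.gz.aa, backup.tar.gz.ab, ...
tar -czf - /path/to/directory | split -b 4300M - backup.tar.gz.
# to restore later, concatenate the pieces and untar
cat backup.tar.gz.* | tar -xzf -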
View 3 Replies
View Related
Oct 9, 2010
This thread was nearly titled "The volume Filesystem root has only 128 KB free space remaining", until I discovered the cause: my Encrypted Private Directory had grown to 20 GB, eating all the free space on my Ubuntu system partition. Here's what happened: all was well with my system last night, when I left it downloading 2 GB of files from the internet to an NTFS drive, only to return to low-space errors this morning. I checked and nothing had been downloaded to my Ubuntu partition, and even if it had, it could have handled the 2 GB without issue. I did some reading on here and the first step I tried found the problem:
Code:
mark@media:~$ df -h
Filesystem Size Used Avail Use% Mounted on
[code]....
View 2 Replies
View Related
Mar 6, 2009
I'm using FC10 and I want to create a symlink to my movies directory in my home folder:
This is what I did:
I created in
/var/www/html
ln -s /home/username/movies movies
Then in /etc/httpd/conf/httpd.conf
DocumentRoot "/var/www/html"
<Directory />
Options FollowSymLinks
AllowOverride None
</Directory>
<Directory "/var/www/html">
Options Indexes FollowSymLinks
AllowOverride None
Order allow,deny
Allow from all
</Directory>
<Directory "/home/username/movies">
Options Indexes FollowSymLinks
Order allow,deny
Allow from all
</Directory>
Restart apache and then the test page is working.
The directory /home/username/movies has following permissions:
drwxrwxrwx 2 apache apache 4096 2009-03-05 23:43 movies
When trying to access my webpage at localhost/movies I get the 403 Forbidden Error.
Ok then, entering:
sudo -u apache ls /var/www/html
> movies
This works, but sudo -u apache ls /var/www/html/movies returns a permission denied error,
as does sudo -u apache ls /home/username/movies.
Is the apache user chrooted by default? SELinux is in permissive mode. What can I do?
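Since the movies directory itself is wide open but listing it as the apache user fails, a likely culprit is the parent directory: /home/username must also grant execute (search) permission for the path to be traversable. A sketch of how to check and fix that, assuming the home directory really is the blocker:
Code:
# check whether apache can traverse each path component
ls -ld /home /home/username /home/username/movies
# grant others execute (search) permission on the home directory
chmod o+x /home/username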
View 4 Replies
View Related
Oct 18, 2010
I'm having a bit of an issue with Lucid installed via Wubi. I stuck the OS on its own partition (30 GB in size), and don't store any large files in the Ubuntu file system (when I download something large I move it to another hard drive.) I don't have anything wacky or esoteric installed on my system.
I've been consistently having a problem where, after a few hours or a few days of being booted up, Ubuntu warns me that my available HD space is dangerously small. The amount of available HD space Ubuntu sees then shrinks from a few GB to nothing within a few minutes, and the only way I can seem to solve this is to reboot. Taking a closer look at what's happening, my Home folder balloons in size until there's no more writable space recognized. But there are no files being created or added to, so it looks like there's a bug of some sort. This SEEMS to be correlated with watching videos (or maybe it's the pulling of large files from a mounted directory into RAM? My videos are all on another HD, as mentioned before). I can generally go a few days without getting the "low space" message, but I can't seem to make it through a full 2-hour movie without getting the error.
View 3 Replies
View Related
Jul 17, 2011
I am trying to make my Apache server show symbolic links in a directory listing, but have so far been unsuccessful. In my latest attempt, I have placed the following code in .htaccess, in the directory with the symlinks that I want listed:
Code:
<Directory />
Options All
</Directory>
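One thing that may be defeating this: Apache does not allow <Directory> sections inside .htaccess files, so the block above will be rejected. If AllowOverride permits Options for that directory, the directives can go in the .htaccess file directly. A sketch:
Code:
# contents of .htaccess in the directory with the symlinks (requires AllowOverride Options or All)
Options +Indexes +FollowSymLinks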
In httpd-vhosts.conf, I have also placed the following code within the relevant <VirtualHost></VirtualHost>:
View 5 Replies
View Related
Feb 17, 2010
I know it is possible to do... but I am not sure how to go about the whole thing. Here's the scenario. I run a lab. Lots of PCs. As time goes by, the older ones dont have the memory or disk space to run more modern apps. But I want to put them to use...
What I am trying to do, and have started, is the following: 1. Install Linux on a bunch of them, make a share on each of these. I've already installed FreeNAS on four machines. (Let's call these machines ClientA, ClientB, ClientC, and ClientD). And have made all the available diskspace
2. Install Linux on a fifth machine (call this Machine1) , and on this machine combine over-the-network all the shares from ClientA, ClientB, ClientC, and ClientD into one large "virtual" directory on Machine1. I know this is do-able, but what I hope to have is the total disk space from all the machines in step 1 to be combined for the purposes of saving files. Not sure which file system to use. For example, if all the other four machines have 2GB of space each, I want to be able to be able to save a 7GB file.
3. And then allow sharing of this one large directory using Samba.
4. Then allow lab users (not on any of the above-mentioned machines) to access the Samba-enabled large shared directory on Machine1 to read and write files. The users will have no idea that the file[s] is/are not on Machine1, or that it may be segmented in some way, nor should they care.
I understand the risks (if any one machine of ClientA, ClientB, ClientC, and ClientD goes down, lose probably everything). I am considering throwing mirroring into the mix (mirror Machine1's large directory), but that can wait.
So in the above scenario, what file system can I use on Machine1 to combine all the shares from ClientA, ClientB, ClientC, and ClientD to make one large "virtual" directory?
I've looked at UnionFS, but from my understanding while it combines directories, the maximum file size is the size of the largest share. Is this true?
View 3 Replies
View Related
Oct 6, 2010
I've got a script to recursively create symlinks in my home directory to my settings directory, to keep the files under version control. I would like it to skip files which are already symlinked via a parent directory. That is, if I have these files/directories:
~/foo/ -> ~/settings/foo/
~/settings/foo/
~/settings/foo/bar
, how do I check that ~/foo/bar and ~/settings/foo/bar are the same file?
Edit: D'oh, another few minutes of searching revealed the answer: readlink -f $path
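For completeness, a sketch of the check using both the canonical-path comparison from the edit and bash's -ef test, which compares device and inode directly:
Code:
# true if both names resolve to the same file
if [ "$(readlink -f ~/foo/bar)" = "$(readlink -f ~/settings/foo/bar)" ]; then
    echo "same file"
fi
# equivalent, using the -ef file test
[ ~/foo/bar -ef ~/settings/foo/bar ] && echo "same file"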
View 3 Replies
View Related