General :: Move Large Amounts Of Music Within A File Structure?

Dec 20, 2009

I have a car stereo that reads a USB drive with all my music on it. However, to sort through the music it finds every folder containing music and displays them all in one flat list. I find this interface annoying because, in order to sort the music by artist, I have to manually move it out of the album folders by hand, which takes a long time for 11+ GB of music. So I was trying to use the Linux CLI to quicken the process, with a command like this:

Code:

mv /media/usb/music/*/*/* /media/usb/music/*/

but for some reason this moves all my music into the last folder alphabetically on my drive. The music is all pre-arranged like this: /media/usb/music/artist/album/song
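The command fails because the shell expands the destination glob too: mv ends up seeing many source paths followed by several /media/usb/music/*/ directories, and treats whichever argument comes last (the alphabetically last folder) as the destination. A safer sketch, assuming the artist/album/song layout described above, moves each song up one level with find:

```shell
# flatten_albums ROOT: move every ROOT/artist/album/song up one level to
# ROOT/artist/song. Caution: same-named songs in different albums by the
# same artist will overwrite each other.
flatten_albums() {
    find "$1" -mindepth 3 -maxdepth 3 -type f -exec sh -c '
        mv "$1" "$(dirname "$(dirname "$1")")/"' _ {} \;
}

# hypothetical usage: flatten_albums /media/usb/music
```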

View 5 Replies



General :: Writing Large Amounts Of Data To Multiple CD/DVDs?

Sep 1, 2009

Are there any tools out there that let me select a bunch of data and burn it to multiple CDs or DVDs? I'm using k3b, but I have to split the data into CD- and DVD-sized amounts manually.

View 1 Replies View Related

CentOS 5 :: Move System And File Structure To New Box

Feb 7, 2011

I am new to Linux and not sure how to explain what I want to do, but I will give it a try. I have CentOS 5.x running on a system that is dying. Is there an easy way to migrate it over to the brand new system that I recently purchased? I only have / and swap partitions, so nothing fancy. I have read that Linux is nothing like Windows when it comes to applications, so perhaps I could simply drag and drop the files onto the new server; however, I suspect there is more involved than that. I hope I can just move the files over and the system will boot, but I am worried about the new hardware on the new system. I am looking for recommendations on this. I am not sure if I have described it correctly, so just point out anything I need to change.
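This is more than drag-and-drop, but less than a reinstall; one common approach (a sketch, not a guaranteed recipe) is to copy the whole filesystem with rsync, then fix the bootloader and /etc/fstab on the new box, since disk names and UUIDs will differ:

```shell
# clone_root SRC DST: copy a system tree, preserving permissions, owners,
# ACLs, xattrs and hard links, while skipping pseudo-filesystems.
# The bootloader and /etc/fstab still need fixing on the new machine
# afterwards, typically from a chroot into the copied tree.
clone_root() {
    rsync -aAXH --numeric-ids \
        --exclude=/dev/ --exclude=/proc/ --exclude=/sys/ \
        --exclude=/tmp/ --exclude=/mnt/ --exclude=/media/ \
        "$1"/ "$2"/
}

# hypothetical usage, with the new disk mounted at /mnt/newroot:
#   clone_root / /mnt/newroot
```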

View 5 Replies View Related

Ubuntu :: Moving Large Amounts Of Files

Mar 6, 2010

I am trying to move a large number of files (over 30k, 86GB in total) to another HDD, but I get an "Argument list too long" error. I tried rsync, cp, and mv and still get the same error.
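"Argument list too long" usually comes from the shell expanding a glob like /src/* into one enormous command line, not from the tools themselves. Invoking the tool on the directory, or letting find batch the arguments, sidesteps it; a sketch:

```shell
# move_all SRC DST: move everything directly inside SRC into DST without
# ever building one giant argument list (find batches the arguments)
move_all() {
    find "$1" -mindepth 1 -maxdepth 1 -exec mv -t "$2" {} +
}

# rsync is the other easy route, since it walks the tree itself:
#   rsync -a --remove-source-files /mnt/src/ /mnt/dst/
```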

View 1 Replies View Related

Ubuntu Multimedia :: Prepare Large Amounts Of Images For Web?

Oct 27, 2010

I've been using GIMP's 'save for the web' tool to reduce the file sizes of images.

I now have a directory with about 50 images. I'd like to avoid processing them all by hand.

I have a (very) basic knowledge of programming, and I'm comfortable with the commandline. I don't mind doing some homework on how to use new tools.

All I'm really concerned with here is reducing the file sizes of the images I have.

What possible pathways are there for me to prepare large amounts of images for the web?
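One pathway, assuming ImageMagick is installed, is its batch tool mogrify; the resize and quality values below are purely illustrative, and mogrify edits files in place, so run it on a copy:

```shell
# shrink_for_web DIR: downscale and recompress every .jpg in DIR.
# WARNING: mogrify overwrites the originals -- work on a copy.
# '1024x1024>' only shrinks images larger than that; -strip drops metadata.
shrink_for_web() {
    mogrify -resize '1024x1024>' -strip -quality 80 "$1"/*.jpg
}

# hypothetical usage: cp -r photos photos-web && shrink_for_web photos-web
```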

View 4 Replies View Related

Server :: MySQL Backup - Deal With Large Amounts Of Data?

Feb 15, 2011

We've been trying to become a bit more serious about backup. It seems the better way to do MySQL backup is to use the binlog. However, that binlog is huge! We seem to produce something like 10GB per month. I'd like to copy the backup somewhere off the server, as I don't feel there is much to be gained by just copying it to somewhere else on the same machine. I recently made a full backup which, after compression, amounted to 2.5GB and took 6.5 hours to copy to my own computer, so that solution doesn't seem practical for the binlog backup. Should we rent another server somewhere? Is it possible to find a server like that really cheap? Or is there some other solution? What are other people's MySQL backup practices?

View 8 Replies View Related

Server :: PAE Kernel (2.6.18) Fails To Swap With Large Amounts Of Physical Ram?

Dec 8, 2009

We're load testing some of our larger servers (16GB+ RAM), and when memory starts to run low they kick off the OOM killer instead of swapping. I've checked swapon -s (which says we're using 0 bytes out of 16GB of swap), I've checked swappiness (60), and I've tried upping the swap to 32GB, all to no avail. If we pull some RAM and configure the box with 8GB of physical RAM and 16 (or more) GB of swap, sure enough it dips into swap and is more stable than a 16GB box with 16 or 32GB of swap.

View 6 Replies View Related

General :: No Space On Root After Large Move Operation?

Feb 16, 2010

I tried to move 2.7TB of data from my /var/webroot/ partition (4.5TB total in size). I left it to run overnight; this morning when I came to check, I saw that all space on the / partition was used up and no operations could be done because of the "no space left on device" message.

Code:

Filesystem Size Used Avail Use% Mounted on
/dev/cciss/c0d0p7 911G 911G 0 100% /
tmpfs 7.9G 0 7.9G 0% /lib/init/rw

[code]....

I freed up several hundred MB on /, but usage is still at 100% and I can't free up any more space or complete the transfer.
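When df still shows 100% after deleting files, the space is often held by deleted-but-still-open files (the interrupted copy, a logger, etc.); the blocks are only released when the holding process closes them or is restarted. A way to check, sketched via /proc (lsof +L1 gives a similar picture with process names, if lsof is installed):

```shell
# deleted_open_fds: list file descriptors whose target file was deleted
# but is still held open -- their disk space is not yet released.
# Restarting (or killing) the owning processes frees the space.
deleted_open_fds() {
    find /proc/*/fd -lname '*(deleted)*' 2>/dev/null
}
```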

View 1 Replies View Related

General :: Copy Certain File Types Recursively While Maintaining File Structure On Destination?

Jun 14, 2011

I have just been bothered by a fairly small issue for some time now. I am trying to search (using find -name) for some .jpg files recursively. This is a Redhat environment with bash.

I get this job done though I need to copy ALL of them and put them in a separate folder BUT I also need to keep the order intact after copying.

For example, if I find a JPG file under /home/usr/new/1/, then the destination also needs to be /test/old/new/1/.

At the moment I am simply putting all files under /test/old/, and I can't seem to get the trailing /new/1/ folder path created under /test/old/.

I understand this could well be done using a while or if/else loop, though if someone can just guide me with a hint, I would be really grateful.

I will complete the rest of the steps and was asking here since I am still not comfortable with the shell/bash scripts yet and planning to be really good at it over the next couple of months.
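As a hint: no loop is strictly needed, because GNU cp has a --parents flag that recreates the source path under the destination. A sketch, assuming the /home/usr to /test/old mapping from the post:

```shell
# copy_jpgs_with_paths SRC DEST: copy every .jpg found under SRC,
# recreating its directory path under DEST (GNU cp's --parents does the
# path creation). DEST must be an absolute path, since we cd into SRC.
copy_jpgs_with_paths() {
    mkdir -p "$2"
    ( cd "$1" && find . -name '*.jpg' -exec cp --parents -t "$2" {} + )
}

# hypothetical usage: copy_jpgs_with_paths /home/usr /test/old
```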

View 1 Replies View Related

General :: Quick File Structure Question For Any User?

Aug 10, 2009

I have used Linux for a web server, but only installed a couple of items on top of the OS, and would like to begin using Linux more often on my own home machine. However, I also like to keep things clean and organized, and to know what is going on when I do something. I have some sample C programs for network programming; they came as a downloadable package with a readme containing the make/configure instructions to get it all set up, and then I can compile individual programs as needed.

I was wondering - when I run make and those first few commands - where does it all go? Will all the new activity be confined to the folder I am in, meaning I can easily remove it all by simply deleting the folder when I am done (I won't want all this sample networking stuff forever, you know). Or, does it get placed into other directories throughout the file system?

I know when installing some apps that files are placed in directories such as /usr/bin and the like. My assumption is this happens when running the make and make install commands; if so, how do we get rid of them when finished?

I just want to keep the system somewhat clean if possible, and at the very least know what is being installed and where, with the option to remove it easily at a later date if I choose to do so.
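Building with make generally stays inside the source folder; it is `make install` that scatters files into /usr/local and friends. Two hedged tricks for keeping track, which work for Makefiles that honour the conventional DESTDIR variable:

```shell
# preview_install: stage `make install` into ./staged so you can inspect
# exactly which files the package would put on the system, without
# touching the real system at all.
preview_install() {
    make DESTDIR="$PWD/staged" install && find staged -type f
}

# `make -n install` is another preview (prints the commands without
# running them), and many projects also ship a `make uninstall` target;
# if not, the staged file list above tells you what to delete later.
```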

View 2 Replies View Related

General :: Creating A Directory Structure - And Setup File Security?

May 15, 2010

I had to jump into a Linux class in college with only 3 weeks left in the course. I thought I would be able to catch on and, go figure, it didn't exactly happen that way. I was given an assignment to do, and I am so far lost it isn't even funny. I need to create a directory structure, set up file security, create a step-by-step instruction manual on how to copy/delete said files, and create a guide to common Linux commands. How would I create these files as root and share them with the other users? And where can I find a list of common commands and their functions?
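For the directory-and-permissions part, a hedged starting point (the names here are made up; adapt them to the assignment). The command guide itself is essentially what the man pages provide:

```shell
# make_course_tree ROOT: create a small directory structure and restrict
# it: owner gets rwx, group gets r-x, everyone else gets nothing.
make_course_tree() {
    mkdir -p "$1/docs" "$1/scripts"
    chmod -R 750 "$1"
    # sharing with other users is then a matter of group ownership, e.g.
    #   chgrp -R students "$1"    (hypothetical group name)
}
```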

View 5 Replies View Related

General :: View A Particular Ten Lines In A Large File Where Can't Open The File In Vi

May 12, 2010

I am using RHEL 5. I have a very large test file which cannot be opened in vi. The file has some 8000 lines. I need to view the ten or so lines between 5680 and 5690. How can I view these particular lines in a large file? What command and options do I need to use?
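For pulling a specific line range out of a file too big for an editor, sed (or a head/tail pipe) does it without loading the whole file:

```shell
# print_range FILE FIRST LAST: print lines FIRST..LAST of FILE; the
# trailing q command makes sed stop at LAST instead of reading to the end
print_range() {
    sed -n "${2},${3}p;${3}q" "$1"
}

# e.g. print_range bigfile.txt 5680 5690
# equivalent pipe: head -n 5690 bigfile.txt | tail -n 11
```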

View 1 Replies View Related

General :: Copy File Recursively Ignoring Destination Directory Structure?

Jul 8, 2011

I have the following content on the source directory:

source/foo.txt
source/bar/another_file.txt
source/bar2/and_another.txt

I want copy those files to a destination directory which, after copy, shall look like this:

destination/foo.txt
destination/another_file.txt
destination/and_another.txt

How can I do this? It seems that "cp" lacks such an option.
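Plain cp has no flatten switch, but find can hand every file to cp individually so the directory part is dropped. A sketch (note that duplicate filenames in different subfolders will clobber each other):

```shell
# flatten_copy SRC DST: copy every file under SRC directly into DST,
# discarding the source directory structure (same-named files collide!)
flatten_copy() {
    mkdir -p "$2"
    find "$1" -type f -exec cp -t "$2" {} +
}
```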

View 1 Replies View Related

Ubuntu Multimedia :: Music Player/jukebox For Large Collection?

Mar 29, 2010

What would be the best music jukebox that doesn't lag when there is a lot of music in the library (400+ gigs)?

I have tried Songbird, but it's pretty laggy. I like to import all my music, turn on shuffle, and go through my songs.

View 3 Replies View Related

Slackware :: Amarok Hangs At 10% When Scanning Large Music Collection?

Jun 6, 2010

After installing Slackware 13.1, I start up Amarok, and when I go in and configure the settings it starts to scan the folder and either hangs at 10%, stops responding altogether, or crashes. The library is about 130 gigs of MP3s. I do not know where to start on this one. Amarok version is 2.3.0.

View 4 Replies View Related

Ubuntu :: Find And Move Commands On Large Number Of Files ?

Feb 21, 2011

We recovered a large number of files from an HD I messed up. I am attempting to move large numbers of files of a given type, e.g. .txt or .jpg, into a folder by type, to more easily sort through them.

Here are the commands I have mainly been trying with various edits:

Code:

Code:

So far the most common complaint I have gotten is "missing arguments to execdir".

This is on Ubuntu 10.04
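That error means find expects a command after -execdir, terminated with \; or +. Since the original commands didn't survive in the post, here is a hedged reconstruction of the sort-by-extension move; with an absolute destination, plain -exec is simpler than -execdir:

```shell
# collect_by_ext ROOT EXT DEST: move every *.EXT file under ROOT into DEST
# (keep DEST outside ROOT, so find doesn't walk the files it just moved)
collect_by_ext() {
    mkdir -p "$3"
    find "$1" -type f -iname "*.$2" -exec mv -t "$3" {} +
}

# hypothetical usage on the recovered tree:
#   collect_by_ext ~/recovered txt ~/sorted/txt
#   collect_by_ext ~/recovered jpg ~/sorted/jpg
```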

View 6 Replies View Related

General :: Using Lynx To Create Html Page With The File Structure Of A Local Directory

Oct 10, 2010

I'm working with a dual-boot laptop running Ubuntu 10.0/Windows 7 and a Debian 5 VPS, though the OSes shouldn't have much impact on my question.

What I would like to do is create an HTML page that I can upload to my VPS which lists all of the files/folders on my local 2TB hard drive (specifically media such as movies, music, and TV shows). The media obviously will not reside on the server, but I would like to at least have a list which will allow me to select, for instance, a band/artist so that it redirects me to the albums in the directory below.

Ultimately, I'm looking for Open Directory Browsing without actually having the media on my server. I have been attempting to create something to this effect using lynx, however, I'm not sure if it can be done with this command or if it's even possible for that matter.
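The tree utility (if installed) can emit exactly this kind of static HTML index, which may be simpler than driving lynx; a sketch with hypothetical paths:

```shell
# make_index DIR OUT: write a static HTML listing of DIR to OUT;
# -H sets the base href used in the generated links (here just '.')
make_index() {
    tree -H '.' -T 'Media index' --charset utf-8 -o "$2" "$1"
}

# hypothetical usage: make_index /media/movies index.html
# then upload index.html to the VPS
```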

View 1 Replies View Related

Debian Installation :: Dividing A Large Upgrade \ Move From Squeeze To Unstable?

Oct 6, 2010

Trying to move from squeeze to unstable -- my downloads add up to some 700 MB or so. So I am trying to batch the upgrade. Some of the big fellas are openoffice and texlive, so I did:

Code:

sudo aptitude hold '?name(openoffice)'
sudo aptitude hold '?name(texlive)'

Is that fine, or are there some pitfalls to this?

View 5 Replies View Related

General :: Copy A File With Certain Name Pattern For Which Exact Place In Complex Directory-structure Unknown?

Sep 19, 2011

I want to copy all files with the name XYZ* into one folder. The problem is that the files are in different subfolders and that not even the depth of the folder structure is the same for all files. Luckily, at least each file has a unique name.

Of course, I thought about the cp command, but I guess the depth of the folder structure would need to be the same for that to work.
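cp on its own indeed can't search subtrees, but find doesn't care how deep each file sits, and unique names mean no collisions here. A sketch with hypothetical paths (-print0/xargs -0 keeps it safe for names with spaces):

```shell
# gather_xyz SEARCHROOT DEST: copy every file named XYZ*, at any depth
# under SEARCHROOT, into the single folder DEST
gather_xyz() {
    mkdir -p "$2"
    find "$1" -type f -name 'XYZ*' -print0 | xargs -0 cp -t "$2"
}

# hypothetical usage: gather_xyz /data /data/collected
```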

View 3 Replies View Related

Ubuntu Multimedia :: Recommend A Good Player For A Large (~900GB) Music Collection?

May 7, 2011

I've recently installed 11.04 and am giving Banshee a shot. It seems pretty good, although it has crashed a few times. But when I import my music folders (about 900GB, 175,000 items... and growing), it takes days. That's not such a big problem, because it only needs to be indexed once, I assume. But the UI is very slow now, so clicking on an album can take several seconds to bring up the tracklist, and when typing into the search box there is a large delay before the matches are shown. Is this just to be expected for such a large DB? I recall Google Desktop indexing and returning results almost immediately back on winblows. Other ones I have tried include Rhythmbox, Amarok, and Songbird, but I have not found any of them to be stable and simple enough to my liking.

Can anyone recommend a good player? My essential requirement is fast and efficient indexing; tag support would be grand, but just based on filenames is OK too. Drag and drop into a play queue like Rhythmbox had would be nice.

View 9 Replies View Related

General :: Best Way To Copy A Large File Over NFS?

Aug 24, 2011

I want to transfer a huge file (60GB) over the NFS network on linux. Is cp the best option?
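cp works, but it gives no progress report and can't resume; for a 60GB transfer onto an NFS mount, rsync is usually the more forgiving choice (a suggestion, not a benchmark):

```shell
# big_copy SRC DSTDIR: resumable, progress-reporting copy of one huge
# file. --partial keeps the half-done file if the transfer is
# interrupted, and --inplace lets a rerun resume it rather than
# rewriting everything copied so far.
big_copy() {
    rsync --inplace --partial --progress "$1" "$2"
}

# hypothetical usage: big_copy /data/huge.img /mnt/nfs/
```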

View 1 Replies View Related

General :: Can't Copy Large File?

Mar 26, 2010

I'm trying to copy a 6GB file across from my laptop to an external USB drive, but it quits at about 4.2GB every time with a "file size limit exceeded" error. I have checked the output of ulimit -a, and there is no limit there on the file size. I'm using the Slax live CD for this, as it always gets the job done.
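That cutoff is almost certainly not ulimit: 4GiB is about 4.29GB, and FAT32 (the usual factory format for external USB drives) caps individual files at just under 4GiB. Reformatting the drive (e.g. to ext3 or NTFS) fixes it outright; if the drive must stay FAT32, a split/rejoin sketch:

```shell
# split_file FILE SIZE: cut FILE into SIZE-d pieces named FILE.part_aa,
# FILE.part_ab, ... For a FAT32 drive, SIZE must stay under 4GiB (e.g. 2G)
split_file() {
    split -b "$2" "$1" "$1.part_"
}

# rejoin on any filesystem with:  cat FILE.part_* > FILE
```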

View 8 Replies View Related

Ubuntu :: Best Way To Move Music From Itunes?

Feb 9, 2011

I want to move all my music from my Mac to my Ubuntu laptop and change the format to Ogg. I am looking for advice on the best (best = easy) way to do this. My music is in MP3 and AAC formats now.
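Assuming ffmpeg with libvorbis is available (package names vary), a batch-conversion sketch; the quality setting is illustrative, and MP3/AAC to Ogg is a lossy-to-lossy transcode that loses a little quality, so keep the originals:

```shell
# to_ogg DIR [RUNNER]: convert every .mp3/.m4a under DIR to Ogg Vorbis,
# next to the original. Pass "echo" as RUNNER for a dry run.
to_ogg() {
    dir="$1" run="${2:-}"
    find "$dir" -type f \( -iname '*.mp3' -o -iname '*.m4a' \) |
    while IFS= read -r f; do
        $run ffmpeg -i "$f" -vn -codec:a libvorbis -qscale:a 5 "${f%.*}.ogg"
    done
}

# hypothetical usage: to_ogg ~/Music        (real run)
#                     to_ogg ~/Music echo   (just print the commands)
```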

View 5 Replies View Related

General :: How To Send A Large File Securely

Aug 28, 2011

I need to send large files from a Linux machine to another using cryptography. The sender machine knows the recipient IP but not vice-versa. I don't need strong cryptography and prefer higher-speed less-secure solutions.

There are no problems with presharing crypto keys but I'd prefer not dealing with SSH users creation.

I am thinking of HTTP PUT over TLS, but I have never had experience with it, and I would prefer to hear what the possible solutions are. I know that it can listen as a daemon, but I don't know anything about cryptography, so piping through OpenSSL may be a solution.
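One preshared-key pattern that avoids creating SSH users is a symmetric OpenSSL cipher spliced into a pipe; the netcat wiring in the comments is a hypothetical sketch (no authentication or integrity checking, which matches the fast-but-less-secure requirement):

```shell
# Symmetric encrypt/decrypt of a stream using a preshared passphrase
# file. AES-128 is fast and adequate for this low-stakes use case.
enc_stream() { openssl enc -aes-128-cbc -pass "file:$1"; }
dec_stream() { openssl enc -d -aes-128-cbc -pass "file:$1"; }

# hypothetical wiring over the network with netcat:
#   receiver (known IP): nc -l -p 9000 | dec_stream secret.key | tar x
#   sender:              tar c bigdir | enc_stream secret.key | nc RECEIVER_IP 9000
```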

View 2 Replies View Related

General :: Using Sed To Replace A *large* Number Of Variables In A File?

Jul 28, 2011

I have a large number of log files on a Linux box which I need to cleanse of sensitive data before sending to a third party. I have used the script below on previous occasions to perform this task, and it has worked brilliantly (the script was built with some help from here :-)

#!/bin/bash
help_text () {
cat <<EOF
Usage: $0 [log_directory] [client_name(s)]
EOF

[Code]...

However, now one of our departments has sent me a CLIENT_FILE.txt with 425,000+ variables! I think I may have hit some internal limit. I have tried splitting the client file into 4 with around 100,000 variables in each, but this still doesn't work. I'm loath to keep splitting, though, as I have 20 directories with up to 190 files in each directory to run through. The more client files I make, the more passes I have to do.
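The original script body didn't survive in the post, but with 425,000 patterns the usual culprit is building one giant command line or expression. A common fix (a sketch, assuming one client name per line and no sed metacharacters in the names) is to generate a sed script file and apply it with sed -f, which has no argv length limit:

```shell
# scrub LOGFILE CLIENTS: print LOGFILE with every client name (one per
# line in CLIENTS) replaced by XXXX, in a single sed pass
scrub() {
    scriptfile=$(mktemp)
    # one generated command per client, e.g.  s/acme/XXXX/g
    # (grep . skips blank lines in the client list)
    grep . "$2" | sed 's|.*|s/&/XXXX/g|' > "$scriptfile"
    sed -f "$scriptfile" "$1"
    rm -f "$scriptfile"
}
```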

View 2 Replies View Related

General :: Compress A Large File Into Smaller Parts?

Aug 18, 2011

I'm looking for a way to compress a large file (~10GB) into several files that won't exceed 150MB each.
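gzip piped into split does this in one pass, with no intermediate full-size archive; a sketch:

```shell
# pack_split FILE [SIZE]: compress FILE and cut the result into pieces of
# SIZE (default 150M) named FILE.gz.part_aa, FILE.gz.part_ab, ...
pack_split() {
    gzip -c "$1" | split -b "${2:-150M}" - "$1.gz.part_"
}

# restore with:  cat FILE.gz.part_* | gunzip -c > FILE
```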

Any thoughts?

View 2 Replies View Related

General :: Monitoring Copy Progress Of A Large File?

Sep 15, 2010

Is there a clever way to monitor the progress (as a percentage or a hash) of copying a large file (using pv could be an option)? Like monitoring the progress of a copy command such as this:

Code:

cp linux.iso /tmp/
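pv does exactly this when used in place of cp; a sketch assuming pv is installed:

```shell
# copy_with_progress SRC DST: like `cp SRC DST`, but with a progress bar
# (pv knows the source size, so it can show a percentage and an ETA)
copy_with_progress() {
    pv "$1" > "$2"
}

# e.g. copy_with_progress linux.iso /tmp/linux.iso
# note: unlike cp, this does not preserve permissions or timestamps
```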

View 2 Replies View Related

General :: Backup Large File To Multiple DVDs

Nov 2, 2009

I work for a school consulting company. We helped a school deploy about 1500 computers. The computers have Windows XP, but we have been using G4L for the restore partition on the drives. So far the software works great. We did, however, run into a problem in that many of the computers we deployed are missing the restore partition. The reason they are missing is long and convoluted and not really that important. What I have been charged to do is try to fix the restore-partition problem. One solution I had, which I'm not even sure will work, was to back up the recovery file that G4L created to DVD and write a basic script to recreate the partition and then copy the file over. This process would need to be as automated as possible, since this disc will be inserted by the end users (the students). The backup file that G4L created is 5.9GB, so it won't fit on just one disc, and dual-layer discs are too expensive to use for this project, so the file will either need to be compressed again (not sure if that's a good idea or not) or split across two DVDs.

I have searched the forums here and was not able to find anything to fix this problem. I was able to find some info on splitting files across two discs, but I'm not sure how to use that to fix my problem.

View 5 Replies View Related

General :: Cannot Find Large Untared MySQL File

Aug 6, 2009

After untarring a MySQL file (very large) I'm trying to find where the file listed below has gone. I did a search on the file name:

Code:

find / -name 'mysql-qui-tools-5.0' -print

But I can't find the file.

Code:

-rwxr-xr-x root/root 9651820 2007-05-02 11:46:01 mysql-gui-tools-5.0/mysql-query-browser-bin

View 6 Replies View Related

Red Hat / Fedora :: Move Music Folder To A Different Hard Drive?

Aug 31, 2010

Am I able to move my Music folder to a different hard drive without moving my whole /home? Or do I just make a folder there, call it Music, and change all the programs that use the default Music folder?
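You don't have to reconfigure each program: move the folder and leave a symlink at the old path, so everything keeps working. A sketch, with the second drive mounted at a hypothetical /mnt/bigdrive:

```shell
# relocate DIR TARGETPARENT: move DIR onto another drive and symlink it
# back, so programs using the old path keep working unchanged
relocate() {
    mv "$1" "$2"/ && ln -s "$2/$(basename "$1")" "$1"
}

# hypothetical usage: relocate ~/Music /mnt/bigdrive
# (a bind mount in /etc/fstab is the other common approach)
```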

View 3 Replies View Related







Copyrights 2005-15 www.BigResource.com, All rights reserved