Ubuntu :: X Number Of Files Not Upgraded ?

Apr 9, 2011

I have Ubuntu 10.10 with three sessions: KDE, GNOME 3 and the regular one (along with its safe mode and all that jazz).

When I run apt-get upgrade (after checking for updates), it says something like: 105 upgraded, 0 newly installed, 0 removed and 74 not upgraded.

How do I get rid of those 74 not-upgraded packages, or force them to upgrade?
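
The likely cause is packages that apt reports as "kept back" because upgrading them requires installing or removing other packages, which a plain upgrade never does. A minimal sketch of the usual fix (the package name in the last line is only an example):

Code:
# Packages listed under "The following packages have been kept back:" are the
# ones counted as "not upgraded". dist-upgrade may install or remove packages
# to satisfy their changed dependencies, which normally clears that count.
sudo apt-get update
sudo apt-get dist-upgrade
# Or pull in a single kept-back package explicitly (example package name):
sudo apt-get install linux-generic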

View 2 Replies



Ubuntu :: Command With The -r Option To Compare A Large Number Of Files And Files In Subdirectories

Jun 16, 2011

I am using the diff command with the -r option to compare a large number of files, including files in subdirectories. My main interest is to find out which files have been changed, not what the actual changes are, and since a lot of files have been changed, it would be much easier to view the file names only. Is there an option for diff that does this, or a similar tool/command that could do the job?
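
A sketch of one way to get only the file names, using plain GNU diff (directory names are placeholders):

Code:
# -q / --brief reports only whether files differ instead of printing the changes
diff -rq dir1 dir2
# Keep only the lines for files present on both sides that actually changed:
diff -rq dir1 dir2 | grep '^Files .* differ$'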

View 1 Replies View Related

Ubuntu :: Upgraded To 11.04 From 9 And All User Files Are Gone

Sep 1, 2011

I just installed Ubuntu 11.04; I chose the option to upgrade from 9.04 (Jaunty) and keep all my user files. I am now in 11.04 and they're all missing. Is there any way for me to get my data back? Does Ubuntu store this data somewhere while upgrading? Before you say it, I know I should have backed up all my stuff first.
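
One thing worth checking, sketched below, is whether the old root or home partition still exists and simply isn't mounted; the device name is only a guess, so adjust it to what fdisk shows:

Code:
# List partitions; an old / or /home partition may still hold the data
sudo fdisk -l
# Mount a candidate partition read-only and look around
sudo mkdir -p /mnt/old
sudo mount -o ro /dev/sda2 /mnt/old
ls /mnt/old/home /mnt/old/lost+found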

View 7 Replies View Related

Ubuntu Installation :: Register DLL Files In 9.10 (upgraded)?

Jan 13, 2010

I need to register some DLL files in Ubuntu. I used the following command and it gives an error:

Code:
Z:\home\haraka>regsvr32 "C:\Program Files\Union Assurance HRM\Alerts.dll"
err:module:import_dll Library MSVBVM60.DLL (which is needed by L"C:\Program Files\Union Assurance HRM\Alerts.dll") not found
Failed to load DLL C:\Program Files\Union Assurance HRM\Alerts.dll
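
The error says Wine cannot find MSVBVM60.DLL, the Visual Basic 6 runtime. A possible fix, assuming winetricks is available for this Wine setup, is to install that runtime and retry the registration:

Code:
sudo apt-get install winetricks   # if not already installed
winetricks vb6run                 # installs the VB6 runtime (MSVBVM60.DLL)
wine regsvr32 "C:\Program Files\Union Assurance HRM\Alerts.dll"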

View 3 Replies View Related

Software :: Join 2 Text Files Based On First Number Present In Every Line Of The 2 Text Files?

Jan 22, 2010

I have 2 text files : file1.txt and file2.txt

cat file1.txt

15 this is a sentence containing various words and spaces
34 this is a another sentence containing various words and spaces

cat file2.txt

2 this is sentence1file2
6 this is sentence2file2
54 this is sentence3file2

I would like to join these 2 files. The result should look as follows :

cat joinedfile.txt

2 this is sentence1file2
6 this is sentence2file2
15 this is a sentence containing various words and spaces
34 this is a another sentence containing various words and spaces
54 this is sentence3file2

==> So the joined file must be sorted on the first number. Any ideas how this can be achieved?
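
Since both inputs are already sorted on their leading number, a numeric merge with sort is enough; a minimal sketch:

Code:
# -n sorts numerically on the first field, -m merges files that are already sorted
sort -n -m file1.txt file2.txt > joinedfile.txt
# Or simply concatenate and sort:
cat file1.txt file2.txt | sort -n > joinedfile.txt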

View 4 Replies View Related

Ubuntu :: Rename Files - Add Number To All In Folder ?

Feb 14, 2010

I'm putting together a script that will add numbers to all the files in a folder.

I've ripped most of my CDs to Oggs for my new PMP, but I found that the PMP doesn't like files that are numbered just as 1 and 2, because it sorts the names as text and so thinks that 2 comes after 10.

So instead of going through all of my music folders and renaming every file by hand from 1 to 01 and from 2 to 02, I'd like to ask if there's a script that can be run to add these leading zeros for me. It'd be even better if it only added the zero to files whose numbers have a single digit.

Here's an example:

I want to rename:

And I'd like to do it to all the single-digit files in the folder, if possible. If not, I can isolate them by hand.
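
A small sketch of one way to do it, assuming the tracks are .ogg files whose names start with the track number:

Code:
#!/bin/bash
# Prefix a 0 to single-digit track numbers, e.g. "1 - Song.ogg" -> "01 - Song.ogg"
for f in [0-9]*.ogg; do
    [[ $f =~ ^[0-9][0-9] ]] && continue   # already two digits, skip
    mv -- "$f" "0$f"
done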

View 4 Replies View Related

Ubuntu :: Modify Number Of Recent Files In Gedit ?

Apr 27, 2010

Is it possible to change the number of files that are displayed in the recent files list in gedit? I'm running Ubuntu 9.04 x64; gedit reports version 2.26.1.
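
gedit 2.x stores its settings in GConf; the key below is the usual location for the recent-files limit, but treat the exact path as something to verify with gconf-editor rather than a certainty:

Code:
# Read the current limit (key path from memory - confirm it in gconf-editor)
gconftool-2 --get /apps/gedit-2/preferences/ui/recents/max_recents
# Set it to, say, 10
gconftool-2 --type int --set /apps/gedit-2/preferences/ui/recents/max_recents 10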

View 2 Replies View Related

Red Hat :: Files Have Owner As Uid Number Rather Than Username?

Nov 18, 2009

On my RHEL ES 4 system I noticed a load of files whose owner is shown as the owner's UID rather than the actual username. Is this usual behaviour? On a similar system the same files show the username as the owner. It's causing me issues because I have changed the user's ID and now some things won't start, meaning I have to manually do a find and chown on the system.
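
ls shows a bare UID whenever no entry in /etc/passwd has that UID any more, which is exactly what happens after changing a user's ID. A sketch of the usual cleanup, with 1001 standing in for the old UID and "username" for the account:

Code:
# See what is still owned by the old numeric UID (-xdev stays on one filesystem)
find / -xdev -uid 1001 -print
# Re-own it to the renumbered account, batching files per chown invocation
find / -xdev -uid 1001 -exec chown username:username {} +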

View 4 Replies View Related

Ubuntu :: Copy A Line By Number In Text Files On Terminal?

Apr 10, 2010

I have a text file. I want to read one line, selected by its number, into a bash variable. That is all.
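
A minimal sketch, picking line 5 of file.txt as an example:

Code:
# sed prints only the requested line; command substitution captures it
line=$(sed -n '5p' file.txt)
echo "$line"
# head/tail works too:
line=$(head -n 5 file.txt | tail -n 1)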

View 1 Replies View Related

Ubuntu :: Data Loss When Transferring Large Number Of Files?

Jul 20, 2010

This problem is not exclusive to Ubuntu (I've experienced it in Windows and OS X as well), but it seems that almost every time I transfer a large number of files (i.e. my music collection) between my desktop computer and laptop via my external hard drive, I end up losing files for no apparent reason. I usually don't notice the files are missing until later on, because I am never informed of any data loss. Now, every time I make a large transfer of files, I just do it two or three times to ensure that I don't lose anything.
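
One way to make the copy verifiable, sketched with rsync and placeholder paths: run the transfer, then run it again as a checksum dry run; if the second pass lists nothing to copy, every file arrived intact.

Code:
rsync -avh /home/user/Music/ /media/external/Music/
# Verify: -c compares checksums, --dry-run only reports what would be copied
rsync -avhc --dry-run /home/user/Music/ /media/external/Music/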

View 2 Replies View Related

Ubuntu :: Find And Move Commands On Large Number Of Files ?

Feb 21, 2011

We recovered a large number of files from an HD I messed up. I am attempting to move files of a given type, e.g. .txt or .jpg, into a folder per type so I can sort through them more easily.

Here are the commands I have mainly been trying with various edits:

Code:

Code:

So far the most common complaint I have gotten is "missing arguments to -execdir".

This is on Ubuntu 10.04
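
"missing arguments to -execdir" usually means the action was not terminated with \; or +. A sketch using mv -t and placeholder directories:

Code:
# "+" ends the -exec clause and batches many files into each mv call
find /recovered -type f -iname '*.jpg' -exec mv -t /sorted/jpg {} +
find /recovered -type f -iname '*.txt' -exec mv -t /sorted/txt {} +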

View 6 Replies View Related

Ubuntu :: Deleting Files Based On Number Created (or In Folder?)

Jul 21, 2011

I'm dipping my toes into some bash scripting and was wondering if there is a way to delete a file based not on how old it is, but rather on how many other files are currently in the folder... or something to that effect.

What I'm doing is creating a script to back up a folder nightly. I'd like to keep a maximum of 3 backups. However, in case the script for some reason fails to run one night (computer turned off, possibly), I don't want the deletion condition to be based on the date.

I know that if I run:

Code:
find /path/to/files* -mtime +3 -exec rm {} \;
that it will delete everything older than three days. -atime and -ctime don't seem to be what I'm looking for... is there another command I can use to achieve what I'm trying to do?
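
To key the deletion on the number of backups rather than their age, one sketch is to sort by modification time and remove everything past the third entry (the backup-*.tar.gz pattern and path are just examples):

Code:
#!/bin/bash
cd /path/to/backups || exit 1
# Newest first; keep the first 3 lines, delete the rest.
# (Assumes the backup names contain no newlines or leading spaces.)
ls -1t backup-*.tar.gz | tail -n +4 | xargs -r rm --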

View 5 Replies View Related

Fedora :: Maximum Number Of Files In Ext3 ?

May 26, 2010

Is there a limit to the number of files ext3 can support?

The reason I'm asking is that on one of my internal drives I have around 750,000 files. The drive is 500 GB with about 150 GB currently used... I noticed recently that when I try to copy a new directory or file, the transfer rate is extremely slow at times. It is SATA II and sometimes gets as low as 500 KB/s (yes, kilobytes!)

Would somebody please shed some light?

I noticed it might be related to the process gvfsd-metadata

I'm using Fedora 12 64-bit

The transfer is from an ext3 to ext3 filesystem.
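
ext3 has no practical cap on total files beyond the inode count fixed when the filesystem was created, so a quick sanity check is whether inodes are running out and whether directory indexing is enabled (mount point and device below are placeholders):

Code:
# IUse% near 100% means the filesystem is out of inodes even if space is free
df -i /path/to/mountpoint
# dir_index (hashed directories) keeps lookups in large directories fast
sudo tune2fs -l /dev/sdb1 | grep -i features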

View 3 Replies View Related

Fedora :: How To Limit The Number Of Files In A Particular Directory

Aug 31, 2011

I was nosing around in my /home folder and noticed that the ~/.thumbnails directory had 38,000+ files in it. That number seems a bit excessive to me. Is there a way to limit the number of files that are allowed to be in that directory, and maybe delete the oldest files automatically when the directory reaches its limit, to make room for new incoming files so there are no "directory full" type errors?
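
There is no built-in per-directory quota, but the thumbnail cache can simply be pruned from cron; a sketch that removes thumbnails not accessed for 30 days:

Code:
# How big is it now?
find ~/.thumbnails -type f | wc -l
# Prune old entries; applications just regenerate thumbnails when needed
find ~/.thumbnails -type f -atime +30 -delete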

View 8 Replies View Related

OpenSUSE :: Maximum Number Of Files Per Directory?

Apr 28, 2011

Is there a maximum allowed number of files per folder in Linux (without risking losing everything)?

I am using openSUSE 11.4 with the latest KDE (4.6?).

I am trying something fast and dirty, and it may be that one folder will contain something like 10^6 files.

Is there anything I should be warned about? Of course filesystem security is still the most important thing.

View 9 Replies View Related

General :: Maximum Number Of Files Per Folder?

Apr 28, 2011

I would like to ask if there is a maximum allowed number of files per folder in Linux (without risking losing everything). I am using openSUSE 11.4 with the latest KDE (4.6?).

I am trying something fast and dirty, and it may be that one folder will contain something like 10^6 files.

Is there anything I should be warned about?
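
For on the order of 10^6 files in one directory, the usual things to check (sketched below with a placeholder device) are that the filesystem has directory hashing enabled and that it is not running out of inodes:

Code:
# dir_index in the feature list means large directories stay fast to search
sudo tune2fs -l /dev/sda2 | grep -i 'features'
# Inode exhaustion is the practical ceiling - watch the IUse% column
df -i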

View 10 Replies View Related

General :: Find Number Of Files In A Directory?

Feb 22, 2010

I need to know how to find the number of files in a directory. Are there any system calls for this in Fedora 12? And I need to know how to perform an operation if that count increases by one.
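
A sketch of both parts: counting the files, and reacting when a new one appears (the second part assumes the inotify-tools package is installed; the directory path is a placeholder):

Code:
# Count regular files directly in the directory
find /path/to/dir -maxdepth 1 -type f | wc -l
# Run something every time a file is created in the directory
inotifywait -m -e create /path/to/dir | while read -r dir event name; do
    echo "new file: $name"    # replace with the real operation
done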

View 14 Replies View Related

General :: Merging Files With Different Number Of Rows Using Awk?

Apr 18, 2011

Does anyone have a solution for merging files when the number of rows in the two (or more) files is not the same? To exemplify, how about merging the following 3 files:

file1:
1
2

[code]...
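
If "merging" here means lining the files up side by side, paste copes with unequal row counts by leaving the missing fields empty; this is only a guess at the intended output, since the example above is truncated:

Code:
# Tab-separated columns; shorter files are padded with empty fields
paste file1 file2 file3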

View 8 Replies View Related

General :: Tar Not Working With Large Number Of Files?

Dec 6, 2010

In the middle of my script I am using the tar command to archive some 1000 images; the size of each image is 5 MB.

All the images are passed as arguments, as in: tar -cvf file.tar <all images as arguments>

But my tar file file.tar does not contain all the images.
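
One robust way to hand a long file list to tar is via stdin rather than as command-line arguments, which avoids any shell argument-length limit; a sketch:

Code:
# GNU tar reads the member list from stdin with -T -; --null pairs with -print0
find . -maxdepth 1 -name '*.jpg' -print0 | tar -cvf file.tar --null -T -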

View 4 Replies View Related

Ubuntu :: BASH Script - How To Stop Loading Files At Arbitrary Number

Aug 9, 2010

I have the following code in bash script:
Code:
#!/bin/bash
COUNT=1
# bash until loop
until [$COUNT -gt 2]; do
pq A$COUNT [Pemptus].pq &
let COUNT=COUNT+1
done

I did this because I'm that much of a Progress Quest geek that I wanted to have a huge group on the online server, so I decided to make a script that would open all the files for me rather than having me do it manually. I created some characters with the boring name of A1, A2, etc. When I ran the above script, it went into a continuous loop and I had to halt it, then run sudo killall pq.exe to eliminate the 500 or so Progress Quest windows that popped open. Anyway, what is wrong with my script that I can't seem to get it to stop loading files at an arbitrary number? I want to get this part finished before I make any more boring named characters.
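
A corrected sketch: [ is a command, so it needs spaces around it and its arguments; "until [$COUNT -gt 2]" looks for a command literally named "[1", the test never succeeds, and until loops forever. Quoting the filename is an extra guess, assuming the save files really are named like "A1 [Pemptus].pq":

Code:
#!/bin/bash
COUNT=1
until [ "$COUNT" -gt 2 ]; do
    pq "A$COUNT [Pemptus].pq" &
    let COUNT=COUNT+1
done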

View 1 Replies View Related

Ubuntu :: Sequentially Number Files Based On Date Modified - Rename Cli

Nov 8, 2010

Sequentially number files based on date modified (rename cli)

I'm almost done with a larger script which takes all the pictures in a folder, converts them to video, and emails it to me. Everything worked fine until I realized the picture filenames weren't always starting at 1, at which point ffmpeg chokes.

I have a bunch of files in a folder which I need to rename to:

I don't want to install any additional packages and I'd like this to run in a single command if possible.

If not possible, then a bash script would work too.
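
A sketch using only coreutils, renaming to zero-padded numbers in date-modified order (assumes .jpg sources and names without newlines):

Code:
#!/bin/bash
n=1
# ls -tr lists oldest first, so the numbering follows modification time
ls -tr *.jpg | while read -r f; do
    mv -- "$f" "$(printf '%03d.jpg' "$n")"
    n=$((n+1))
done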

View 3 Replies View Related

Ubuntu :: Large Number Of Files To Count - Argument List Too Long

May 28, 2011

I have the standard problem of trying to count a large number of files in a directory (>100k)

I have tried: ls ~/user/images/* -l | wc -l and find ~/user/images/* -maxdepth 1 -type f | wc -l

In both cases, I get the argument list too long error message.

I have tried using xargs but I can't seem to get it to work right.

The command

returns a valid answer but it includes all the subdirectories in the file count.
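
The "argument list too long" comes from the shell expanding the * itself; passing the directory name (not a glob) to find sidesteps it, and -maxdepth 1 keeps subdirectories out of the count. A sketch:

Code:
find ~/user/images -maxdepth 1 -type f | wc -l
# ls also avoids the glob, but note it counts subdirectory entries as well:
ls -1A ~/user/images | wc -l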

View 4 Replies View Related

Debian :: Rsync Hangs After Copying A Certain Number Of Files

Nov 9, 2015

I currently have a problem running rsync on 64-bit Debian Jessie (although the problem also occurred with 64-bit Debian Wheezy). I am trying to use rsync to archive my home directory (which is on a hard disk) to a USB memory stick. The home directory is about 18 GB in size and the memory stick has 32 GB.

Unfortunately, rsync hangs after copying a certain number of files and the process eventually has to be killed. Rsync was rerun but hung again at about the same point as before. This has now happened several times; each time the hang occurs at about the same point. Use of strace after the hang shows that rsync appears to be processing a PDF file at the time, although not always the same PDF file. I originally had the rsync hang problem on a PC which ran 64-bit Wheezy and used a USB 2.0 port.

I am now running rsync on another PC, which runs 64-bit Jessie and uses a USB 3.0 port. I have also tried three different USB sticks, two from one manufacturer and the third from another manufacturer. All give similar rsync hangs.

View 11 Replies View Related

Fedora :: Script To Join Number Of Jpg Files Into Bound Pdf

Jun 24, 2010

The other day, I needed to send the bank a few signed documents (~40 pages). I scanned the signed documents in JPG format and wrote up this script to make a bound PDF. I find it quite useful; have fun.
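
The script itself is not included in this excerpt; for reference, one common way to bind numbered JPG scans into a single PDF is ImageMagick (file names below are placeholders):

Code:
# Pages are added in the order the shell expands them, so zero-padded names help
convert page-*.jpg signed-documents.pdf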

View 1 Replies View Related

General :: Copying Large Number Of Files From One Directory To Another

Feb 10, 2010

I have a directory containing around 2.8 lakh (280,000) files. I want to move them to another directory. If I use cp or mv then I get the error 'argument list too long'. If I write a script like

for file in `ls *`; do
    cp {source} {destination}
done

then, because of the ls command, its performance degrades. How can I do this?
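
A sketch that avoids both the argument-list limit and the per-file overhead of the loop, with placeholder directories: find builds its own argument lists, and "+" batches many files into each mv call.

Code:
find /source/dir -maxdepth 1 -type f -exec mv -t /destination/dir {} +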

View 7 Replies View Related

General :: Copying Large Number Of Files In Windows?

Mar 15, 2011

I am facing a problem copying a very large number of files, 18 lakh (1,800,000) of them, from my personal hard disk to another hard disk. Each file is very small and the size of the folder is around 3.95 GB. Copying the files with the copy provided by Windows is frustrating, and I am not even able to compress them; it gives me an error that they are not readable. The other problem is that I am not able to open this drive in Linux: it shows an error telling me to run chkdsk in Windows, and the Windows disk check is also not able to repair the drive and goes into some unsolvable mode. Is there any way to open a disk with errors, and if not, any way I can copy the data faster? ERROR: Disk labeled EDU is corrupt, go to Windows and run chkdsk /f there and reboot into Windows 2 times.
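
From the Linux side, one thing sometimes worth trying (sketched with a guessed device name) is ntfsfix from the ntfs-3g package, which can clear a few common NTFS error states so the volume will at least mount read-only for copying; it is not a substitute for chkdsk:

Code:
sudo ntfsfix /dev/sdb1
# Then attempt a read-only mount and copy the data off
sudo mkdir -p /mnt/edu
sudo mount -t ntfs-3g -o ro /dev/sdb1 /mnt/edu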

View 3 Replies View Related

General :: Unix - Remaining Number Of Open Files?

May 11, 2011

ulimit -a tells me I have a limit of 1024 open files, which is the default on my distro. Is there a way to show how many of these are currently used, or how many are remaining?
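
The ulimit applies per process, so the count has to be taken against a specific PID; a sketch:

Code:
# File descriptors open in the current shell
ls /proc/$$/fd | wc -l
# For another process (1234 is an example PID); /proc gives the exact fd count,
# while lsof also lists memory-mapped files and so reports a larger number
ls /proc/1234/fd | wc -l
lsof -p 1234 | wc -l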

View 1 Replies View Related

General :: Count The Number Of Lines In All Files In This Directory?

Jul 5, 2011

I want to count the lines of all files in this directory and all its subdirectories, but exclude directories "public", "modules", and "templates".
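
A sketch using find's -prune so the excluded directories are never descended into:

Code:
find . -type d \( -name public -o -name modules -o -name templates \) -prune \
    -o -type f -print0 | xargs -0 cat | wc -l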

View 2 Replies View Related

General :: List Folders By The Number Of Files Recursively

Aug 11, 2011

Is there any Linux application for finding the folders with the most files? Baobab sorts folders by their total size; I'm looking for a tool that lists folders by the total number of files in them.

The reason I'm looking is that copying tens of thousands of small files is excruciatingly slow (much slower than copying a few large files of the same total size), so I want to archive or delete the folders with high file counts that will be slowing down the copying (it won't speed things up now, but it will be faster when I need to move or copy the data again in the future).
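
Short of a dedicated GUI tool, a sketch that prints directories ranked by their recursive file count:

Code:
# "count directory", biggest first, top 20
find . -type d | while read -r d; do
    printf '%d %s\n' "$(find "$d" -type f | wc -l)" "$d"
done | sort -rn | head -20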

View 5 Replies View Related

General :: Share A Large Number Of Files Into Chroot Env?

Aug 17, 2010

I understand that chroot is usually used to provide security, however, for my issue, security is a big don't care. I am very new to using chroot and don't fully understand how the chroot'd env works.

problem: I am trying to use a vendor-supplied cross-compile environment. The environment runs as a chroot'd env and works just fine. I have a large number of additional modules that I wish to compile in the chroot'd environment. FYI, these modules are also (successfully) compiled for other targets that don't use chroot'd envs. Copying the source files into the chroot environment is not an option (I don't have hours to wait for copies to finish, and it would break the make system). Having them live in the environment is also not an option (the chroot build is a tiny part of the build process and we cannot revamp our entire source tree to accommodate it).

I am looking for a way to have the compiler in the chroot'd env have access to a path that is outside of the env and typically higher up in the same path that holds the chroot'd env. I have tried soft links (they don't work as expected). Hard links only work for single files and there are 10's of thousands of files that would need to be linked. I am not sure how I would go about exporting the additional files and then mounting the exported files in the chroot'd env (or if that would even work).
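
The usual answer for exactly this situation is a bind mount, which makes an existing directory visible at a second path without copying anything; a sketch with placeholder paths:

Code:
sudo mkdir -p /path/to/chroot/mnt/src
sudo mount --bind /real/source/tree /path/to/chroot/mnt/src
# ... run the cross compiler inside the chroot; the sources appear under /mnt/src ...
sudo umount /path/to/chroot/mnt/src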

View 2 Replies View Related






