General :: List Files Bigger Than Filesize Specified?
Jan 15, 2011
How can I make ls (or any other command) list only files bigger than a specific file size?
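ls has no size filter of its own, but find can do the filtering and hand the matches to ls; a minimal sketch, with 10M as an example threshold:
Code:
find . -maxdepth 1 -type f -size +10M -exec ls -lh {} +
Drop -maxdepth 1 to search subdirectories as well.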
I'd like to know the total file size of all the files found with the find command, so
Code:
find -iname '*.mpg' | xargs -I {} du -sh {}
but this gives me the file size of each file, since each line is passed to "du" separately.
How can I pass the whole list through the pipe in one go?
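One way, assuming GNU du: feed all the names to a single du invocation and let -c print a grand total (tail keeps only the total line):
Code:
find . -iname '*.mpg' -print0 | du -ch --files0-from=- | tail -n 1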
I have a zenity message box in a script
zenity --info --text='done' > /dev/null 2>&1
I need to pop up a message, e.g. "file is smaller than 30 KBytes!", when a file is smaller than 30 KBytes. How could I write an "if then else" script that pops up a zenity message when, for example, "FILE" is smaller than 30 KBytes?
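A minimal sketch, assuming GNU stat and treating 30 KBytes as 30720 bytes; FILE is a placeholder path:
Code:
FILE=/path/to/file
# stat -c %s prints the size in bytes
if [ "$(stat -c %s "$FILE")" -lt 30720 ]; then
    zenity --info --text='file is smaller than 30 KBytes!' > /dev/null 2>&1
else
    zenity --info --text='file is 30 KBytes or larger' > /dev/null 2>&1
fi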
How can I know the file size of a download before actually downloading it?
Using Ubuntu/Fedora
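If the server reports a Content-Length header, a HEAD request shows the size without downloading anything; the URL below is only a placeholder, and not every server sends the header:
Code:
# fetch headers only and look for the size in bytes
curl -sI http://example.com/file.iso | grep -i content-length
# wget --spider does much the same and prints a Length: line
wget --spider http://example.com/file.iso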
I am trying to output an md5 or sha1 hash along with the full path/filename and the file size, but I can't seem to find a way to do this.
with
Quote:
find . -printf '%s %p'
I can retrieve the size, the full path and the filename,
but I am not able to merge that info with the md5 or sha1 of the file.
My aim is to have a file such as this:
6435b607f86b6e6be1e77bb3b1987677d1377275 ./abc/asda/file1.txt 404
6435b607f86b6e6be1e77bb3b987677d13772725 ./abc/asda/file2.txt 1404
Also, performance is an issue for me, since I need to get the info out of 10 million files (approx. 6 TB), so commands like find are preferred, and fewer passes over the data would be great too.
By the way, I've tried to use something like this:
Quote:
find . -type f -printf '%s %p'| xargs awk '{x=system("md5sum "$2)}END {print x" "$2" "$1}'
but the variable x contains the return value of the system() call to md5sum, not its stdout.
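A rough sketch that keeps find as the driver (it needs bash for read -d ''). It spawns one sha1sum and one stat per file, so over 10 million files it may be worth batching the hashing with -exec sha1sum {} + instead:
Code:
# one line per file: hash, path, size; -print0 copes with spaces in names
find . -type f -print0 |
while IFS= read -r -d '' f; do
    printf '%s %s %s\n' "$(sha1sum "$f" | cut -d' ' -f1)" "$f" "$(stat -c %s "$f")"
done > filelist.txt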
I have uploaded a load of files and have now found a bug when I try to zip a file that's over 2 GB in size. I am quite a novice with this, so I need easy instructions on how to get it to work.
I followed this tutorial: [url]
The error I get is:
Quote:
zip warning: name not matched
I know I can do find . -type f, but that includes binary files, and I couldn't find a way to exclude them with find.
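One option is to let the file utility decide what counts as text and keep only paths whose MIME type starts with text/:
Code:
find . -type f -exec sh -c 'file -b --mime-type "$1" | grep -q "^text/"' sh {} \; -print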
I can't delete any files bigger than 4 GB. I get a message telling me that my trash is full and that I should empty it, but there is nothing in it. Is there anything I can do to be able to delete files over 4 GB?
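Deleting from a terminal with rm bypasses the desktop trash entirely (the path below is a placeholder); if the trash itself seems stuck, it normally lives under ~/.local/share/Trash:
Code:
rm /path/to/big_file                                               # no trash involved
rm -rf ~/.local/share/Trash/files/* ~/.local/share/Trash/info/*   # force-empty the trash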
Here is the error I get from unzipping a "file" that is approximately 20 GB:
Quote:
VLC media player seemed to have trouble opening bigger files, say more than 400 MB. And it would open avi files.
I am trying to get this script to work. The purpose is to download a set of modules from slax.org; the list consists of module numbers and is comma-delimited. What I am trying to do is download the file corresponding to each number in the list. This is what I have done so far, and I am at a standstill.
#!/bin/sh
# Wget script to retrieve modules from slax.org modules
#
# ----Begin of user defined values -----
# Path to wget
[code].....
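As a purely hypothetical sketch of how the rest of such a script might look (the list file name and the download URL pattern are placeholders, not taken from the original post):
Code:
#!/bin/sh
# ----Begin of user defined values -----
WGET=/usr/bin/wget
LIST=modules.csv                        # comma-delimited list of module numbers (assumed name)
BASEURL="http://www.slax.org/modules/"  # placeholder URL pattern - replace with the real one
# ----End of user defined values -----

# turn the commas into newlines and fetch one module per number
tr ',' '\n' < "$LIST" | while read -r num; do
    [ -n "$num" ] && "$WGET" "${BASEURL}${num}"
done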
I have a directory /vol0/archives that contains several files, e.g. arch_00001.arc, arch_00002.arc, arch_00003.arc. I want to tar each of those files into a separate tarball, keeping the file-name sequence, e.g. arch_00001.arc.tar.gz, arch_00002.arc.tar.gz, arch_00003.arc.tar.gz. How do I write the tar command so that it picks up those files and tars each one separately, as mentioned above?
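A minimal sketch: loop over the .arc files and create one compressed tarball per file:
Code:
cd /vol0/archives
for f in arch_*.arc; do
    tar -czf "$f.tar.gz" "$f"    # e.g. arch_00001.arc -> arch_00001.arc.tar.gz
done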
How can I list (using ls) all files that are not empty (size > 0) in Linux?
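ls cannot filter on size by itself, but find can select the non-empty files and pass them to ls:
Code:
find . -maxdepth 1 -type f -size +0c -exec ls -lh {} +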
I want to untar a bunch of files located in different folders, with the folder depth unknown. I found an old post about this, but its suggestion extracts all the files into the same folder (the current one). I want to extract each archive into the same folder as its tar file. The solution from the old post (which extracts everything into the current folder) was: find . -name "*.tar" -exec tar xvf {} \;
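GNU find's -execdir runs the command from the directory containing each match, so every archive is extracted next to its own tar file:
Code:
find . -name '*.tar' -execdir tar xvf {} \;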
How do I (on the command line) list all the .txt files (or files sharing any other common attribute) in a directory?
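For a single directory a glob is enough, and find handles the recursive case:
Code:
ls -l -- *.txt                       # current directory only
find . -maxdepth 1 -name '*.txt'     # the same, via find
find . -name '*.txt'                 # recursive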
How do I list only the hidden files in the current directory?
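Hidden files are just names starting with a dot, so either of these works (the glob skips . and ..):
Code:
ls -ld .[!.]*                               # dot files and dirs, skipping . and ..
find . -maxdepth 1 -name '.*' ! -name '.'   # the same idea with find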
I have run a command which finds a lot of files based on some search criteria. It returns the files like so:
./somepath/somepath/file.something
./asdf/asdf/s.php
./etc/a.php
./a/b/c/d/e/f/g.jpg
So I was wondering, if I capture this output into a file (i.e. one file per line), can anyone help me write a command which iterates through the file and moves the files one by one to a specified directory?
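A minimal sketch that reads the captured list line by line; DEST is a placeholder for the target directory and filelist.txt for the captured output:
Code:
DEST=/path/to/destination
while IFS= read -r f; do
    mv -- "$f" "$DEST/"
done < filelist.txt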
I'm using Ubuntu (Natty), and when I use ls -l the files are listed, but apparently the sorting ignores any special characters. For ages I've used underscores to mark special folders, and it seems to me that they were always listed first. Now the underscore is completely ignored. Let's assume that I have the files fileA, _fileB and fileC in a folder. Currently, ls -l orders them like so:
malbert@dredg:/tmp/1$ ls -l
total 0
-rw-r--r-- 1 malbert domain users 0 2011-08-03 15:27 fileA
-rw-r--r-- 1 malbert domain users 0 2011-08-03 15:27 _fileB
-rw-r--r-- 1 malbert domain users 0 2011-08-03 15:27 fileC
[Code]...
I've dug through the ls man page and could not find anything. Is there a system-wide collation option somewhere, or something like that?
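The order comes from the locale's collation rules rather than from ls itself; forcing the C locale restores plain byte-order sorting, in which _fileB is listed first:
Code:
LC_COLLATE=C ls -l      # one-off
export LC_COLLATE=C     # e.g. in ~/.bashrc to make it the default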
What I would like to do is print the contents of all the text files in a particular directory, recursively. The problem is that there are directories and possibly binaries scattered around the filesystem as well.
Trying cat * works as long as there are no directories in there, but when there are it gives an error instead and prints nothing.
I'm sure it's easy using file -f or something, but I can't figure it out!
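One rough approach: grep -I treats binary files as if they contained no matches, so it can act as a text-file filter in front of cat:
Code:
# list the files grep considers text (non-empty, not binary), then print them
find . -type f -exec grep -Il '' {} + | while IFS= read -r f; do
    cat "$f"
done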
When I try to list the files in a directory, I get an I/O error:
#ls -l /test
Why am I getting this error, and what are these I/O errors?
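I/O errors from ls usually mean the kernel had trouble reading the underlying device or filesystem, so the kernel log is the first place to look:
Code:
dmesg | tail -n 30    # recent kernel messages, often showing disk or filesystem errors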
My system is CentOS 5.5, and I need to find the nobody:nobody directories and files under data.
There is a directory named "data", and this directory has a great many directories and files generated by a web program. Most of them are owned by nobody:nobody.
I want to make a list of these nobody:nobody directories and files.
Is it possible to make a list of these directories and files?
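find can match on owner and group directly; adjust the path to wherever data actually lives:
Code:
find /path/to/data -user nobody -group nobody > nobody_list.txt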
I want to know how much damage a user could do on my system if he decided to delete everything (or overwrite it, in the case of corruption). What command or script might I use to check this?
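A rough first pass, limited to one filesystem: everything the user owns plus anything world-writable (this ignores files the user can reach through group membership); USERNAME is a placeholder:
Code:
find / -xdev \( -user USERNAME -o -perm -0002 \) 2>/dev/null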
How do I list only the files modified today in Linux? How do I scp today's updated or modified files to another server? How do I list files together with their modification dates in Linux? I am currently using Ubuntu 10.04.
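With GNU find, -daystart plus -mtime 0 limits matches to files modified since midnight; the user, host, and path in the scp line are placeholders:
Code:
find . -type f -daystart -mtime 0 -ls                                         # list today's files
find . -type f -daystart -mtime 0 -exec scp {} user@otherserver:/backup/ \;   # copy them over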
I want to recursively list all the files in a given directory, with their full path and their timestamps. Something like this:
10:30 Dec 10 2010 /tmp/mydir/myfile
I've tried with:
find . -type f -exec ls -la {} \;
but that doesn't give me the full path.
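GNU find can print the timestamp and the path itself; starting from an absolute directory makes the printed paths absolute:
Code:
find /tmp/mydir -type f -printf '%TH:%TM %Tb %Td %TY %p\n'
# e.g.  10:30 Dec 10 2010 /tmp/mydir/myfile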
I have a file with wildcard patterns:
./include/*
./src/*
etc.
From the current directory I would like to recursively get the list of files that do not match these patterns.
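A sketch that turns each line of the pattern file (assumed here to be called patterns.txt) into a ! -path test for find; it needs bash for the array:
Code:
args=()
while IFS= read -r pat; do
    args+=( ! -path "$pat" )    # e.g. adds: ! -path './include/*'
done < patterns.txt
find . -type f "${args[@]}"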
Given a single SMB network share (for example, \\server\SHARED_FOLDER), I want to recursively list all the files, including those in the subdirectories (like find(1)).
I would prefer to do it in Linux, but I also accept Windows answers.
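On Linux, smbclient is often suggested for walking a share without mounting it, or the share can be mounted over CIFS and handed to find; the username and mount point are placeholders:
Code:
smbclient //server/SHARED_FOLDER -U username -c 'recurse; ls'    # recursive listing
# or mount it and use find
sudo mount -t cifs //server/SHARED_FOLDER /mnt/share -o username=username
find /mnt/share -type f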
Is there any Linux application for finding the folders with the largest number of files? Baobab sorts folders by their total size; I'm looking for a tool that lists folders by the total number of files in them.
The reason I'm looking is that copying tens of thousands of small files is excruciatingly slow (much slower than copying a few large files of the same total size), so I want to archive or delete the folders with high file counts that will be slowing down the copying (it won't speed things up now, but it will be faster when I need to move/copy them again in the future).
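No GUI is strictly needed; a pipeline that counts how many files each directory directly contains gives a rough ranking (it does not roll subdirectory counts up into their parents):
Code:
find . -type f -printf '%h\n' | sort | uniq -c | sort -rn | head -n 20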
How do I find and list the files and directories in the current directory which were created in, say, 2005, 2006, and 2009, and then move them to some other location, for example /backup? Yes, I need to list them and move them at the same time. We can use:
Code:
find . -mtime n {};
but that n is troublesome for me: it is hard to work out which value picks up files/directories created in 2005, 2006, and 2009, for instance. Is there any way to match exactly by year value rather than calculating the "n" (days * 24 hours)?
System Info:
SunOS 5.8 Generic_117350-06 sun4u sparc SUNW,Ultra-Enterprise
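Unix filesystems do not record a creation time, so modification time is the usual stand-in. A portable trick that also works on old systems such as Solaris 8 is to bracket each year with two reference files made by touch -t and then use -newer (repeat per year):
Code:
touch -t 200501010000 /tmp/start_2005    # 1 Jan 2005 00:00
touch -t 200512312359 /tmp/end_2005      # 31 Dec 2005 23:59
find . -newer /tmp/start_2005 ! -newer /tmp/end_2005 -exec mv {} /backup \;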
I'm looking for a way to produce a list of all the directories in the current working directory, sorted by the total number of files contained within them.
Initially I thought that Nautilus could be used for this, but then I realised it doesn't count files in subdirectories.
The best I've got for a command line solution so far is this
Code:
The use case for this is a situation where a user has a quota applied to their home directory which limits the number of files they are allowed to have, and they have exceeded that limit.
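A sketch that counts every file beneath each top-level directory (subdirectories included) and sorts by that count, highest first:
Code:
for d in */; do
    printf '%s %s\n' "$(find "$d" -type f | wc -l)" "$d"
done | sort -rn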
I am trying to add a line of text to the hgrc files of my Mercurial repos. The file is normally found in a hidden .hg directory under each repo. I need to add a "deny_read = username" line to the end of each of these hgrc files. Suggestions, either as a shell script or as a one-liner?
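A one-liner sketch that appends the line to every hgrc under the current tree; USERNAME is a placeholder, and note that hgweb normally reads deny_read from the [web] section, so appending blindly only works if that happens to be the last section in the file:
Code:
find . -path '*/.hg/hgrc' -exec sh -c 'echo "deny_read = USERNAME" >> "$1"' sh {} \;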