Red Hat :: File Size Varies With Same 'ls -lh' Command Ran On Two Different Versions?
Oct 22, 2010
I ran ls -lh on the same tarball on an RHEL 3 box and an RHEL 5.3 box. The sixth column of the ls -lh output showed 6.3G on the RHEL 5.3 box but 16E on the RHEL 3 box. Both machines use an ext3 file system.
Find below the output for RHEL 3 and RHEL 5.3 respectively :
2.0T -rwxr-xr-x 1 root root 16E Oct 20 10:34 bac.tar.bz2
6.3G -rwxrwSrwx 1 root root 6.3G Oct 20 10:34 bac.tar.bz2
Does anyone know how 'ls' deals with such large numbers?
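One quick sanity check on either box, sketched below, is to ask for the exact byte count instead of the human-readable column; a value like 16E (16 exbibytes) suggests the older tools on RHEL 3 are misreading the size of a file larger than 4GB rather than reporting anything real.
Code:
stat -c '%s bytes, %b blocks of %B' bac.tar.bz2   # exact size and allocated blocks
ls -l --block-size=1 bac.tar.bz2                  # size column in plain bytes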
View 6 Replies
May 25, 2010
When I load a local .csv file into the MySQL server, the date format varies and does not match what the table expects. How can I solve this and make the formats match?
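A common way to handle this, sketched below with a hypothetical database, table, and column layout, is to load the raw date into a user variable and convert it explicitly with STR_TO_DATE, so the format used inside the CSV no longer matters:
Code:
mysql -u root -p mydb <<'SQL'
LOAD DATA LOCAL INFILE '/tmp/data.csv'
INTO TABLE samples
FIELDS TERMINATED BY ',' IGNORE 1 LINES
(id, value, @raw_date)
SET sample_date = STR_TO_DATE(@raw_date, '%d/%m/%Y');
SQL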
View 1 Replies
View Related
Apr 19, 2011
I am trying to figure out the actual size of files and directories on a CentOS Linux 5 server. When I do an ls -l I see, for example, 4096 for the /Data directory, but once inside the directory an ls -l shows much larger file sizes. How do I get the actual size of a directory to show up?
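ls -l on a directory only reports the size of the directory entry itself (commonly 4096 bytes), not of its contents; du is the usual tool for the total. For example:
Code:
du -sh /Data       # total size of everything under /Data
du -sh /Data/*     # per-entry totals inside /Data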
View 3 Replies
View Related
Jan 14, 2011
I've spent hours trying to scan and shrink a multipage PDF document without losing readability. This is the first time I've ever needed to do this! (I had to scan each page as ".jpg" in order to email and open it on another computer, so I could not scan to PDF directly, which I think is why each page was so large; lower DPIs made the text too blurry.) I found this great tip on UbuntuGeek, but anyone can do this if GhostScript is installed:
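The tip referred to is, as far as I recall, the standard GhostScript re-distilling trick; a sketch with placeholder file names (the /ebook preset trades some image quality for size, /screen shrinks it further):
Code:
gs -sDEVICE=pdfwrite -dCompatibilityLevel=1.4 -dPDFSETTINGS=/ebook \
   -dNOPAUSE -dBATCH -dQUIET -sOutputFile=scan-small.pdf scan-big.pdf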
View 1 Replies
View Related
Jun 10, 2010
Is there software in Linux that can split a big file into smaller files?
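The coreutils split command does exactly this; a minimal sketch with a hypothetical file name:
Code:
split -b 100M bigfile.iso bigfile.part_   # cut into 100MB pieces
cat bigfile.part_* > bigfile.iso          # reassemble later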
View 1 Replies
View Related
Jan 19, 2011
Does lvresize with the --resizefs option resize the Logical Volume and then resize the file system? In other words, do we not need to use resize2fs separately? I looked at the man pages, but they don't explain this option.
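For what it's worth, lvresize -r / --resizefs is documented to call fsadm, which in turn runs the appropriate tool (resize2fs for ext3/ext4), so a separate resize2fs step should not be needed. A sketch with a hypothetical VG/LV name:
Code:
lvresize --resizefs -L +10G /dev/vg0/lv_data    # grows the LV, then the filesystem
# roughly equivalent to:
# lvresize -L +10G /dev/vg0/lv_data && resize2fs /dev/vg0/lv_data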
View 3 Replies
View Related
Dec 14, 2010
How can we find the maximum size of the inode table, what decides it, and how is the maximum volume size of a file system decided?
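For an existing ext3 volume you can at least read the figures that were fixed at mkfs time; the inode count is set when the filesystem is created (via the bytes-per-inode ratio) and cannot grow later. A quick look, assuming a hypothetical device name:
Code:
df -i /dev/sda1                                                     # inodes total/used/free
tune2fs -l /dev/sda1 | grep -iE 'inode count|block count|block size'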
View 4 Replies
View Related
Sep 8, 2010
I'm curious what version of Wine I have installed, and what ATI drivers are currently installed. If Wine isn't 1.2 or 1.3, how do I update it from the command line? Any insight into this process would help; it's not absolutely critical, but I've been looking around and haven't found information. Some references to good articles on becoming a command-line guru would be cool as well.
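A few harmless checks that answer the version questions (standard commands, nothing distro-specific assumed):
Code:
wine --version                       # installed Wine version
glxinfo | grep -i 'opengl version'   # GL/driver version in use
lsmod | grep -E 'fglrx|radeon'       # which ATI kernel module is loaded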
View 8 Replies
View Related
Mar 31, 2016
Quick background: the office in my shop is just a partitioned area so is subject to temperature and dust, just as the workshop area is. I've upgraded my desktop from an AMD Phenom II X4 955 which used to suffer overheating problems (despite regular cleaning) to an AMD FX-6300 on a Gigabyte 970A-D3P motherboard. For cooling I've fitted a Cooler Master Hyper TX3, plus various fans. Fresh install of Debian 8.3.
The cooler has certainly solved the overheating issues and the machine runs very quietly, rather than sounding like a 747 at take-off. But I'm now having some problems getting consistent reporting on temperatures.
The BIOS reports temperatures from the CPU which seem to be fairly consistently in the mid to high 30s (C).
lm-sensors and hddtemp have been installed.
sensors-detect reports (just the last section)
Code:
Now follows a summary of the probes I have just done.
Just press ENTER to continue:
Driver `k10temp' (autoloaded):
* Chip `AMD Family 15h thermal sensors' (confidence: 9)
Driver `fam15h_power' (autoloaded):
* Chip `AMD Family 15h power sensors' (confidence: 9)
[Code].....
psensor identifies AMD CPU and the NVidia GPU, apparently correctly, as it does the HDDs.
So, my problem is that the CPU temps reported by psensor and the panel app differ quite significantly from those reported in the BIOS. My thinking is that the BIOS is correct and that the software is either misreporting the temperatures or using the wrong sensors.
On the old machine I used to get a temperature from each core, just like in the BIOS, but now I'm only getting a single CPU reading.
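For what it's worth, as I understand it the k10temp driver on Family 15h chips exposes a single package temperature (derived from the Tctl register, so it is a control value rather than a per-core die reading), so seeing one CPU value instead of one per core is expected. Comparing the raw sensors output against the BIOS figure is a reasonable first step:
Code:
sensors | grep -A3 -i k10temp    # package temperature from the kernel driver
sensors | grep -A3 -i fam15h     # power readings, if the driver exposes them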
View 14 Replies
View Related
Oct 30, 2010
My machine has 4 SATA 2 Western Digital 1TB disks. I made 2 partitions on each of them, 500GB per partition. When I started using them I checked their I/O using iozone. The first partition gets 100MB/s for read and 70MB/s for write, and the second partition gets 80MB/s for read and 55MB/s for write. All 4 disks show the same result.
As I kept using them, the I/O speed on each partition decreased, to a different extent on each. For example, across the four first partitions, the write speed now varies from 69MB/s to 56MB/s, even though I have the same amount of data on each of them, all 11% used.
My guess is that this comes down to the disk block allocation policy: some disks start writing from an inner location while others write on the outer edge, even though the amount of data on each disk is the same.
View 3 Replies
View Related
May 29, 2010
I'm having problems with my wireless. It connects without any problem, but it is very slow. Actually, I have the impression that the connection speed varies from normal to very slow, but the average speed is very slow. I have been searching the forums for more than a month without any success. I have tried installing the compat-wireless drivers and kmod-wl with many different network setups. It is certainly not a problem with my home network, since it works fine with my other old notebook running Fedora 11, or when I boot into Windows.
My system configuration is:
- sony vaio VPCCW23FX
- card: Atheros AR9285
- driver: ath9k
- operating system: fedora 12 - kernel 2.6.32.12-115.fc12.x86_64
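Two things commonly worth checking with ath9k on kernels of that era are the negotiated bit rate/link quality and 802.11 power saving, which has a reputation for causing exactly this kind of throughput collapse; a sketch (wlan0 is assumed, the interface name may differ):
Code:
iwconfig wlan0                    # check Bit Rate, Link Quality, Power Management
sudo iwconfig wlan0 power off     # temporarily disable power saving
dmesg | grep -i ath9k             # look for resets or errors from the driver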
View 2 Replies
View Related
Jun 16, 2011
I am trying to install a module in OpenERP 6. It requires PyUNO, which ships with LibreOffice/OpenOffice. If I run "import uno" I get "no module named uno", but when I go to /usr/lib64/libreoffice/base3.4/programs and run Python there, "import uno" works.
I can't just copy the file into Python's path, as the versions differ.
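Rather than copying files between installations, the usual workaround is to point the interpreter at the directory that already contains uno.py. The paths below are guesses based on the directory mentioned above and will need adjusting to the actual layout:
Code:
export PYTHONPATH=/usr/lib64/libreoffice/basis3.4/program:$PYTHONPATH
export URE_BOOTSTRAP=vnd.sun.star.pathname:/usr/lib64/libreoffice/program/fundamentalrc
python -c 'import uno' && echo "uno imported OK"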
View 3 Replies
View Related
Jun 30, 2011
I'm not too clear on the difference between LTS versions and other versions, but I think I may want to go with LTS. Can someone tell me if my thinking is correct given the following situation: I have some very cool but very expensive software installed with a group license from my school, a school I won't be attending much longer. So I want to go as long as possible without reinstalling Ubuntu, because once the product is licensed it will stay licensed until I reinstall Ubuntu (or uninstall the program). I think this means keeping whichever Ubuntu version I install for as long as possible.
So in this case, should I go with 10.04 LTS, or should I just install Natty Narwhal and keep that as long as possible? It looks like 10.04 LTS will be "supported" longer, but I'm not exactly clear on everything "supported" entails. Presumably it means security and software updates will be available for 10.04 LTS much longer than for 11.x versions? So I'm thinking I should go with 10.04 LTS.
Is my thinking correct in going with 10.04 LTS? Edit: It was pointed out that this would be against my contractual agreements, which I suppose is probably true.
View 8 Replies
View Related
Apr 22, 2011
I am curious whether I am doing something wrong when extracting pages from a PDF with pdftk and creating a new file. I am only extracting the odd pages and outputting them to a new file that is now only 20 pages instead of the input's 40 pages, yet the new output file is still 1.4MB in size, the same as the original.
It seems strange to extract only half the pages of a large document and end up with a result of the same size. How can I streamline the resulting PDFs using pdftk?
BTW this is the command I am using, in case perhaps I am missing an option to optimize file size or something:
Code:
pdftk A=ch15.pdf cat A1-40odd output odd.pdf
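pdftk copies page objects as-is, and shared resources (fonts, images) referenced by the kept pages come along in full, which is probably why the size barely changes. Two things to try, sketched with the same hypothetical file names: pdftk's own compress filter, or re-distilling the result through GhostScript:
Code:
pdftk odd.pdf output odd_small.pdf compress
gs -sDEVICE=pdfwrite -dPDFSETTINGS=/ebook -dNOPAUSE -dBATCH -dQUIET \
   -sOutputFile=odd_smaller.pdf odd.pdf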
View 1 Replies
View Related
Feb 23, 2009
I'm researching symbolic links used with Samba/CIFS. I'd like a user on a MS-Windows OS to be able to see my shared folder on CentOS 5 and the symbolic links inside it. It works, but the user sees a file size bigger than the real file: apparently CIFS takes the size of the symbolic link (approximately 32K) and adds it to the size of the file. Example 1: a 100KB file in a shared folder; the MS-Windows user sees 100KB. Example 2: a 100KB file reached through a symbolic link inside a shared folder; the MS-Windows user sees 132KB (symlink size plus file size). Is there a way for the user to see only the size of the file, and not the file plus the symbolic link?
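If the Windows clients don't need UNIX-style symlink semantics, one thing to experiment with (a sketch, not a guaranteed fix for the size display) is turning off CIFS UNIX extensions so the server simply follows the links and presents the target file; the share name and path below are placeholders:
Code:
# smb.conf - serve the link target instead of the link itself
[global]
    unix extensions = no
[data]
    path = /srv/data
    follow symlinks = yes
    wide links = yes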
View 1 Replies
View Related
Jun 13, 2011
I was just testing specifying a file-size limit for a user and added the following to /etc/security/limits.conf: "bob soft fsize 100". This should basically say not to allow bob to create any file greater than 100KB in size.
But the interesting thing is, if bob already has any file greater than 100KB, he is not even allowed to log into the system, from either the console or SSH, and nothing shows up in the logs. How do I configure it so that bob can log into the system even though he has files larger than 100KB, while still preventing him from creating files greater than 100KB?
View 3 Replies
View Related
Jul 12, 2010
We have some large files with sampling data in them. We don't want to delete these files, but we do want to quickly overwrite them with 0s and/or 1s while preserving the original file size.
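One approach, assuming GNU coreutils, is shred, which overwrites a file in place without changing its length; dd with conv=notrunc can do the same thing byte-for-byte. A sketch with a hypothetical file name:
Code:
shred -n 0 -z samples.dat                                    # single pass of zeros, size preserved
# or, more manually (slow but exact):
size=$(stat -c %s samples.dat)
dd if=/dev/zero of=samples.dat bs=1 count="$size" conv=notrunc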
View 3 Replies
View Related
Jun 22, 2009
I upgraded from Fedora 10 to Fedora 11, and it seems that when I saved my main.cf file, something changed between the versions and it no longer works. I've tried going through the file and tweaking it, but it's not working. Here is the postconf output:
[Code]...
View 7 Replies
View Related
Mar 1, 2010
In a book, I read that the cmchk command is used to get the disk block size. But in Ubuntu it doesn't work, as the command is not available. Can somebody tell me its equivalent in Ubuntu?
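cmchk comes from SCO-style UNIX systems; on Ubuntu the same information is available from blockdev or tune2fs (assuming an ext filesystem and a hypothetical device name):
Code:
sudo blockdev --getbsz /dev/sda1                   # block size used by the kernel
sudo tune2fs -l /dev/sda1 | grep -i 'block size'   # filesystem block size
stat -f -c 'block size: %s' /                      # per-mount view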
View 4 Replies
View Related
Jun 18, 2010
I am using dd to back up entire system partitions and am now trying to restore one. The resulting disk image from my buggy process has zero bytes. D'oh. It apparently treats the image as trailing garbage and ignores it; it deletes the original file and replaces it with a zero-byte .dd file. I still have the original copy of the image in a .dd.gz file. It's 6.3GB, so it may still contain the data. How do I get the original image back without destroying it again?
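The safe move is to decompress to standard output so gunzip never touches or deletes the .gz original; a sketch with placeholder names (double-check the target partition before writing anything to it):
Code:
gunzip -c backup.dd.gz > backup.dd              # keeps backup.dd.gz intact
# or stream it straight onto the partition being restored:
gunzip -c backup.dd.gz | dd of=/dev/sdXN bs=4M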
View 1 Replies
View Related
Aug 20, 2011
I'm calculating the size of directories using "du":
PHP: function du($path)
{
    // run du in byte mode; escape the path and capture the output lines
    exec('du -sb ' . escapeshellarg($path), $res);
    // du prints "<bytes>\t<path>", so take the leading number
    $size = (int) $res[0];
[Code].....
But then again, it doesn't calculate the actual file size; it reports a size aligned to 1024 bytes, just as Windows does with its 4096-byte cluster size. Is there a way to calculate the actual file size, e.g. 1021 bytes?
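If the goal is the exact sum of file bytes rather than allocated blocks, one option (a sketch with a placeholder path) is to have the shell sum the per-file sizes and call that from PHP instead of du:
Code:
find /path/to/dir -type f -printf '%s\n' | awk '{ total += $1 } END { print total }'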
View 7 Replies
View Related
Nov 18, 2010
I'm trying to do something like this:
Code:
#!/bin/bash
cmd1=$(cat /var/log/messages | grep -e 'blocked for more than 120 seconds' | cut -c 55-62)
if $cmd1 != 0; then echo 'okay'; fi
However, I'm messing up somewhere: bash attempts to evaluate the elements in cmd1. When I try to run this script it complains, saying:
Quote:
test1.sh: line 5: blocked: command not found
I am open to alternatives. My intent is to replace cat /var/log/messages with dmesg, so I can attempt to determine if a problematic application I use encounters a blocked state (unresponsive for more than 120 seconds).
Should I be using a different test condition? I tried something like:
Code:
# this declares cmd1 as an array
cmd1=($(cat /var/log/messages | grep -e 'blocked for more than 120 seconds' | cut -c 55-62))
#attempt to determine if number of elements in array is greater than zero
if ${#cmd1[@]} > 0; then echo okay; fi
But I get the same error... what am I doing wrong?
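The error comes from the bare "if $cmd1 != 0" and "if ${#cmd1[@]} > 0" lines: without a test command, bash executes the expansion as a command (and "> 0" is treated as a redirection). A sketch of two working variants, keeping the same log and pattern:
Code:
# variant 1: let grep's exit status be the test
if grep -q 'blocked for more than 120 seconds' /var/log/messages; then
    echo okay
fi

# variant 2: keep the array and test its element count
cmd1=($(grep -e 'blocked for more than 120 seconds' /var/log/messages | cut -c 55-62))
if [ "${#cmd1[@]}" -gt 0 ]; then
    echo okay
fi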
View 3 Replies
View Related
Sep 4, 2010
How do I get the L2 and L3 cache size?
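A few standard ways to read the cache sizes (nothing assumed beyond util-linux, glibc, and sysfs):
Code:
lscpu | grep -i cache                                   # L1/L2/L3 summary
getconf -a | grep -i cache                              # sizes in bytes
grep . /sys/devices/system/cpu/cpu0/cache/index*/size   # per-level, per-CPU view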
View 4 Replies
View Related
Oct 26, 2009
I want to know if there is a command to find the size of a folder.
View 4 Replies
View Related
Sep 29, 2010
I know these folders each have >80GB of files, yet they only show 4.0K in ls -lah. How can I have ls show the size including the contents?
[root@aapsan01 aapxen01]# ls -lah
total 48K
drwxrwxrwx 6 root root 4.0K Sep 29 03:45 .
drwxrwxrwx 15 root root 4.0K Sep 27 09:15 ..
[Code]....
View 4 Replies
View Related
Apr 24, 2010
I'm trying to find a command or program to show which files and folders are taking up the most space on the hard drive, much like TreeSize on Windows. Is there an equivalent on Linux?
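du sorted by size gets close to TreeSize from the command line, and ncdu (if it's available in your repos) gives an interactive view; a sketch:
Code:
du -h --max-depth=1 / 2>/dev/null | sort -hr | head -20   # biggest top-level directories
ncdu /                                                    # interactive browser, if installed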
View 2 Replies
View Related
Apr 7, 2011
I'm using fc14 and the SG driver to test some SCSI (SAS) targets. In doing so, I'm bumping up against what appears to be a 512KB maximum transfer size per command. Transfers up to 4MB sometimes work, but often they result in ENOMEM or EINVAL returned from the write() function in the SG driver. I could not find any good documentation on how the SCSI system in Linux works so I've been studying the source for drivers in drivers/scsi.
I see that there is a scsi_device struct that contains a request_queue struct that contains a queue_limits struct that contains an element called max_sectors. The SG driver seems to use this to limit the size of the reserve buffer it is willing to create. I see that there are several constants used to initialize max_sectors to 1024 which would result in the 512KB limit I see (with targets having 512 byte sectors). At this point I have several questions:
1) When the open() function for the sg driver gets called, who initializes the scsi_device struct with the default values?
2) Can I merely change the limits struct to arbitrary values after initialization and cause the SG ioctls to set the reserve buffer to allow greater values?......
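Before patching driver structures, it may be worth checking whether the limit you are hitting is simply the per-device max_sectors_kb, which is exposed (and on many kernels writable, up to max_hw_sectors_kb) through sysfs; a sketch with a placeholder device name:
Code:
cat /sys/block/sdX/queue/max_sectors_kb        # current per-request limit
cat /sys/block/sdX/queue/max_hw_sectors_kb     # ceiling imposed by the HBA/driver
echo 1024 > /sys/block/sdX/queue/max_sectors_kb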
View 2 Replies
View Related
Sep 10, 2010
When I change the directory using 'cd', the length of my shell prompt keeps increasing. To make it clearer, see below:
Code:
$> cd MyWorks
$ Myworks> cd Shell
[code]....
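That growing prompt is most likely the PS1 definition appending the new directory each time instead of substituting it; setting PS1 to use bash's own escape for the current directory keeps the length stable. A sketch (put it in ~/.bashrc to make it permanent):
Code:
export PS1='\u@\h:\W\$ '   # \W shows only the last path component
# compare with \w, which expands to the full working directory path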
View 8 Replies
View Related
Jan 6, 2011
How would I list every mp3 over a certain size on an entire hard drive?
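find can do this in one pass; a sketch that treats "a certain size" as 10MB, adjust to taste:
Code:
find / -type f -iname '*.mp3' -size +10M -exec ls -lh {} \; 2>/dev/null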
View 1 Replies
View Related
Aug 27, 2010
I'm running Fedora 13. On a full-screen terminal, how do I increase the size of the text?
View 4 Replies
View Related