General :: Limit On File Size - Doesn't Allow User To Create Files Which Are Greater Than 100KB
Jun 13, 2011
I was just testing setting a file size limit for a user, and added the following to /etc/security/limits.conf: "bob soft fsize 100". This should mean that bob is not allowed to create any file greater than 100 KB in size.
But the interesting thing is that if bob already has any file greater than 100 KB, it doesn't even allow him to log into the system, either from the console or over SSH. Also, nothing is written to the logs. How do I configure it so that bob can log in even though he has files greater than 100 KB, but still cannot create files greater than 100 KB?
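For reference, a minimal sketch of the relevant limits.conf syntax; the hard line and its 1024 value are illustrative additions, not from the post, and fsize is measured in KB:
Code:
# /etc/security/limits.conf
# soft cap: bob's processes cannot grow a file past 100 KB
bob    soft    fsize    100
# illustrative hard cap that bob could raise his own soft limit up to
bob    hard    fsize    1024
One hedged guess at the login failure: any write during login that pushes a file past the cap (shell history, X session files and the like) gets the writing process killed with SIGXFSZ, which can tear the session down before anything is logged.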
View 3 Replies
May 16, 2011
Last weekend I increased the open file limit (ulimit -n) for the application user ID. I updated limits.conf with the necessary entries and restarted the service, and the server as well. When I check the ulimit value for that specific user by switching to it from another user, it shows the new value (10240), but if I log in directly with the application ID, the ulimit value shows 1024, which is the default one.
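A hedged sketch of the pieces to check; the user name "appuser" and the PAM file names are placeholders:
Code:
# /etc/security/limits.conf (or a drop-in under /etc/security/limits.d/)
appuser   soft   nofile   10240
appuser   hard   nofile   10240

# pam_limits must be loaded by the PAM stack of the login path actually used;
# su and sshd/login use different stacks, which can explain the differing values
grep pam_limits /etc/pam.d/sshd /etc/pam.d/login /etc/pam.d/su

# verify in a fresh direct login
ulimit -n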
View 6 Replies
Jan 17, 2011
Based on some web references, an Apache access log file larger than 2 GB will affect Apache's performance. Is this correct?
View 4 Replies
May 11, 2010
I have a large file (deflated size: 602191947 bytes) that is not being saved to my Ubuntu One account. On syncing, the file is uploaded and eventually reaches 602191947 bytes, and then nothing more happens to it, while syncing of the following files in the queue continues successfully. I have tried a manual upload with the same result. The file is still marked as 'uploading' even after several tries, logins/logouts, and reboots. So I was just wondering whether there is a file size limit; I can't seem to find information regarding this.
View 5 Replies
Oct 20, 2010
A possibly preposterous question. I am aware that you can designate a swap file or swap partition on your hard drive that Linux uses as "memory". Suggested sizes for the swap file that I've seen range up to about 1024 MB. Is there a limit to the swap file size that you can set? Basically, I am running a Perl script that processes a massive file (DNA sequence data) and requires around 48 GB of memory to run, maybe a bit less. So, would it be possible to set a swap file to a massive, ridiculous size (~60 GB or whatever) and successfully run such a script on a desktop? Yes, I am aware that it would massively slow down the process. The thing is, if the Perl script normally completes in about half an hour, and I can get it working on a desktop, I don't mind if it takes days or weeks to complete. I really don't. That's because it takes days or weeks to get access to a computer with the required grunt to do it. So, is this a stupid idea? Is it even possible? If so, given a Perl script that normally completes in half an hour on a 48 GB system, would it take days? Weeks? Decades?
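For what it's worth, creating an oversized swap file is mechanically straightforward; a sketch, with the path and the 60 GB figure purely illustrative:
Code:
sudo dd if=/dev/zero of=/swapfile60g bs=1M count=61440   # write 60 GB of zeros
sudo chmod 600 /swapfile60g
sudo mkswap /swapfile60g
sudo swapon /swapfile60g
swapon -s                                                # confirm the new swap area is active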
View 7 Replies
Nov 5, 2010
I've noticed that gedit has problems opening files longer than about 8,000 lines. Was gedit not designed for long files, or is there another problem? The same thing also happens with complicated HTML files. I hope there is a way to fix this.
View 4 Replies
Jan 4, 2010
I have a self-made application running on a small embedded Linux device (which should not matter) that uses syslog to output error, warning and debug logs. There is a "better" syslog daemon installed, called syslog-ng, which has some more features, but I miss a very important one: how to limit the size of the logfiles to some dedicated number of megabytes. I was able to create rotating logfiles with this configuration in syslog-ng.conf:
Code:
destination testlog {
file("/var/log/test/log-$S_WEEKDAY"
[code]...
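If the plain file() destination offers no per-file size cap (which is my understanding of syslog-ng), one hedged workaround is to let logrotate trim the weekday files; the path and sizes below are illustrative:
Code:
# /etc/logrotate.d/test
# rotate each weekday file once it passes 1 MB, keep three old copies,
# and truncate in place so syslog-ng keeps its open file handle
/var/log/test/log-* {
    size 1M
    rotate 3
    missingok
    notifempty
    copytruncate
}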
View 2 Replies
Dec 16, 2010
I have a single 6.2 GB file that needs to go on a FAT32-formatted HDD. Does anyone know of a way to split the file so it will fit?
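A sketch with split(1); FAT32 tops out just under 4 GB per file, so 2 GB chunks leave a comfortable margin (the file names are placeholders):
Code:
split -b 2G bigfile.img bigfile.part.     # produces bigfile.part.aa, bigfile.part.ab, ...
# to reassemble later:
cat bigfile.part.* > bigfile.img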
View 2 Replies
Apr 21, 2010
Does Recordmydesktop have a file size limit? I'm considering using the Zero compression setting to keep CPU usage down, but I don't want to run up against a 2 GB or 4 GB file size limit. While I know some filesystems impose this limit, most screen recorders I've used have a 2 GB or 4 GB limit when recording, regardless of the filesystem. Is this an issue with Recordmydesktop?
View 1 Replies
May 7, 2010
I'm trying to copy a 7.8 GB tar.gz file to an external hard drive via the command line. It gets to an even 4 GB and stops, with an error that says "file size limit exceeded". I edited /etc/security/limits.conf to include "root hard fsize 10024000", but that didn't do anything at all. Yes, I am copying this as root.
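limits.conf only governs per-process limits, so one hedged first check is whether the external drive is FAT32, which caps individual files at 4 GB regardless of any ulimit; the mount point and device names below are placeholders:
Code:
df -T /media/external      # shows the filesystem type of the mounted drive
sudo blkid /dev/sdb1       # or inspect the partition directly
ulimit -f                  # confirm the shell itself reports "unlimited" for file size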
View 9 Replies
Jun 22, 2011
Using setrlimit() I am setting the core file size to RLIM_INFINITY, but the core file is still not being generated, although /var/log/messages says a core is being generated.
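A few hedged things to check from the shell alongside the setrlimit() call; nothing here is specific to the original program:
Code:
ulimit -Hc                            # setrlimit() cannot raise the soft limit past this hard limit
cat /proc/sys/kernel/core_pattern     # where (and under what name) the kernel writes cores
ls -ld .                              # the target directory must be writable by the crashing process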
View 3 Replies
Feb 23, 2009
I'm researching symbolic links used with Samba/CIFS. I'd like a user on a MS Windows OS to be able to see my shared folder on CentOS 5 and the symbolic links inside that folder. Well, it works, but the user sees a file size that is bigger than the real file. Apparently, CIFS takes the size of the symbolic link (approximately 32 KB) and adds it to the size of the file.
Example 1: a 100 KB file in the shared folder; the MS Windows user sees 100 KB.
Example 2: a 100 KB file reached through a symbolic link inside the shared folder; the MS Windows user sees 132 KB (symlink size + file size).
Is there a way for the user to see only the size of the file, and not the file plus the symbolic link?
View 1 Replies
Jun 7, 2010
I have a command line server that logs to stdout, which I start along the lines of ./server > log.txt
What I want to do is limit the size of log.txt, without modifying the server.
I am assuming there must already be some kind of tool that lets me do this - something where I can pass in my server command, the output file, and a size limit? If so, can anyone enlighten me?
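Two hedged options that avoid touching the server itself; both assume you are happy with the log being rotated or chunked rather than hard-truncated, and the paths are placeholders:
Code:
# Apache's rotatelogs (ships with the httpd/apache2-utils packages) starts a new file every 10 MB
./server | rotatelogs /var/log/server/log.txt 10M

# or chunk the stream into 10 MB pieces with split
./server | split -b 10M - /var/log/server/log.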
View 3 Replies
Apr 12, 2010
I downloaded pdftk 1.41 and installed it on Red Hat Enterprise Linux 4, 32-bit. I am primarily using this utility to uncompress PDF files to remove the Flate compression. It works well with small PDFs. However, when I uncompress PDF files of 35 MB or more, the uncompressed output file grows up to 2 GB and then the uncompression fails with the error "File size limit exceeded". I can concatenate two files into an output file of up to 3 GB, so 2 GB is not a limitation at the Linux level.
View 1 Replies
Feb 2, 2010
I want to add 50 new users who are not on the server yet. I want to add them all to the group Accounting in one operation, not user by user. I want to set up a default password for them all, and have it say something like 'You must now change your password or no access will be permitted'. Are there any other options I should also set once, rather than for each user?
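A sketch of doing it in one loop; the user names and the placeholder password are invented for illustration:
Code:
groupadd -f Accounting
for i in $(seq -w 1 50); do
    useradd -m -G Accounting "acct$i"
    echo "acct$i:ChangeMe123" | chpasswd
    chage -d 0 "acct$i"       # expires the password so the first login forces a change
done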
View 3 Replies
Dec 7, 2009
Fedora 12, gcc 4.4.1. I am doing some programming, and my program gave me a stack dump. However, there is no core file for me to examine.
So I did:
Code:
ulimit -c unlimited
and got this error message:
Code:
bash: ulimit: core file size: cannot modify limit: Operation not permitted
I also tried setting ulimit to 50000 and still got the same error. The results of ulimit -a:
Code:
$ ulimit -a
core file size (blocks, -c) 0
data seg size (kbytes, -d) unlimited
scheduling priority (-e) 0
file size (blocks, -f) unlimited
[Code]...
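Since a non-root shell cannot raise its soft limit above the hard limit, one hedged check is the hard value, and one hedged fix is raising it in limits.conf (the user name is illustrative):
Code:
ulimit -Hc                 # if this prints 0, "ulimit -c unlimited" will be refused
# as root, in /etc/security/limits.conf:
#   youruser   hard   core   unlimited
#   youruser   soft   core   unlimited
# then log in again and retry: ulimit -c unlimited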
View 3 Replies
May 24, 2010
I want to create a logon script (or some such) that creates a file if it doesn't already exist, and otherwise checks the file for some info. If it finds a given trigger in that file, it logs into a local database and does some operations.
Now, my problem isn't with creating that file or even getting it to function as a logon script - it's with permissions. After the logon script creates the file, I want the user to have read access on it ONLY. Further, I don't want to give the user any kind of root access that would let them do the database operations in question or chown/chmod the file.
What's the best practice here? I'm noticing that whenever the script runs (in .bashrc right now), it runs with the current user's permissions. Ideally, I'd like the login script to run at a higher level of permissions than the user has. Is this even possible? What's the best way to do this?
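One hedged pattern is to keep the privileged work in a separate root-owned helper and let the user run only that helper through a narrow sudoers rule; every path and name below is hypothetical:
Code:
# /etc/sudoers.d/login-marker (edit with visudo)
#   alice ALL=(root) NOPASSWD: /usr/local/sbin/make_login_marker

# inside /usr/local/sbin/make_login_marker (runs as root):
install -o root -g alice -m 0640 /dev/null /var/lib/app/alice.marker   # create an empty file the user can only read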
View 1 Replies
May 4, 2011
Is it possible to set up folder permissions like the below?
-Users can create files/folders in the shared folder.
-But the files/folders they create cannot be deleted/changed by them
(they can only be deleted by root users).
-Each new file/folder created is automatically owned by root.
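As far as I know, plain Unix permissions cannot force new files to be owned by root, and the creator can still modify their own files, so the sticky bit only approximates the request (users can create but cannot delete other users' files); a sketch with a hypothetical path:
Code:
mkdir -p /srv/shared
chmod 1777 /srv/shared      # world-writable plus the sticky bit, like /tmp
ls -ld /srv/shared          # shows drwxrwxrwt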
View 4 Replies
Apr 15, 2009
I've been looking for this feature for months and couldn't find a solution for this. Does anyone know how to create users and limit the user to a specified directory?
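One hedged approach, if SFTP access is enough for these users, is OpenSSH's ChrootDirectory; the user name and path are placeholders, and the chroot target must be root-owned and not group-writable:
Code:
# /etc/ssh/sshd_config
Match User limited1
    ChrootDirectory /srv/jail/limited1
    ForceCommand internal-sftp
    AllowTcpForwarding no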
View 6 Replies
Sep 13, 2010
How can I limit a user's mailbox to a specific size?
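The answer depends on the mail system; for Postfix, a hedged example of the server-wide cap (true per-user caps usually need quota support in the delivery agent):
Code:
postconf -e 'mailbox_size_limit=52428800'   # about 50 MB, in bytes
postfix reload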
View 2 Replies
Jan 21, 2011
I want to restrict a particular user from creating any single file beyond a particular size, i.e. he should not be able to create a file larger than a particular size (say 10 MB), but he can still use up to 10 GB overall (I don't mean quota space).
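A hedged sketch using the same limits.conf mechanism as the first post; fsize counts KB and limits the size of any single file, not total usage, which seems to match the request (the user name is illustrative):
Code:
# /etc/security/limits.conf
# 10 MB cap on any single file the user writes (fsize is in KB)
someuser   hard   fsize   10240

# or, for a quick test in the user's own shell:
ulimit -f 10240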
View 6 Replies
Feb 17, 2011
In trying to solve a friend's lack of foresight, I have managed to disable my own system.
I was using dd_rescue to make a copy of a drive with a corrupt and unfixable partition table. I was a fool: I had a drive mounted at /media/Storage, but ran the backup to /media/storage.
Thus, dd_rescue completely filled my primary drive before informing me that there was a problem.
I don't really trust myself with command-line work, so I foolishly sudo'ed nautilus and deleted the folder /media/storage.
Unfortunately, although I didn't realize it at the time, the available space on the drive still read 0 bytes.
I tried running sudo apt-get clean from a terminal, but for some inane reason the laptop screen won't support the display setting for the terminal login, so I just had to hope that I was doing it right.
I wasn't, and decided to try working from a Live CD so I could see what I was doing.
The folder /root/.Trash/ doesn't exist on Ubuntu's install drive, and I can't figure out why the drive's properties say "Contents: 241310 files, 3.7 GB" but also "Total capacity: 52.8 GB. Free space: 0 bytes".
Any suggestions on how I can get this to shake out?
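Some hedged diagnostics from a root shell (or from the Live CD with the affected drive mounted) that usually explain "I deleted it but the space did not come back"; the paths are placeholders:
Code:
sudo du -xh --max-depth=1 / | sort -h        # where the space actually went; swap / for the affected drive's mount point on a Live CD
sudo ls /root/.local/share/Trash/files       # newer GNOME keeps root's trash here, not in /root/.Trash
sudo lsof +L1                                # deleted files still held open by a running process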
View 3 Replies
Jun 10, 2010
Is there software that can split a big file into smaller files in Linux?
View 1 Replies
Oct 15, 2010
I have BackInTime backing up my computer to a RAID cluster. The problem is that BackInTime doesn't have an option to limit the disk space it uses. I also use this drive as a file server, and need to be able to keep some space open for that.
Is there a way I can limit the amount of space a specific folder can take up? Alternatively, is it possible to create a disk image that only takes up the amount of space actually used, but can automatically expand up to a certain size? It would work similarly to the Mac SparseBundle format.
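A hedged sketch of the sparse-image idea: the file only consumes what is written into it but cannot grow past the size you give it; all paths and the 100 GB figure are illustrative:
Code:
truncate -s 100G /srv/raid/backintime.img     # sparse file: 100 GB apparent size, ~0 actually used
mkfs.ext4 -F /srv/raid/backintime.img
sudo mkdir -p /mnt/backintime
sudo mount -o loop /srv/raid/backintime.img /mnt/backintime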
View 7 Replies
Mar 2, 2011
I have 2 directories in my home folder that I would like to set a size limit on. The directories are ~/backup and ~/temp. Is there an easy way to limit the size of a directory without having to make partitions?
View 4 Replies
May 8, 2010
I have a two-hour-long home video which I edited in a video editor program. I'd like to burn it to DVD, but first I need to export it to an MPEG file. Cinelerra doesn't allow me to render to MPEG; instead it offers .avi or .dv. The problem is that the resulting file size is enormous (i.e. 1 minute of .avi = 1.2 GB, or around 500 MB when I output to .dv). What file format would be best to render to while at the same time not getting an insanely big file? I'd like to keep it under 1 GB if possible.
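One hedged route, if ffmpeg is available, is to render to .dv from Cinelerra and then transcode to DVD-compliant MPEG-2; the file names are placeholders:
Code:
ffmpeg -i render.dv -target pal-dvd output.mpg     # use -target ntsc-dvd for NTSC sources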
View 2 Replies
Apr 7, 2011
1. How can the sum of system time and user time be greater than the real time?
2. Even though my program is not waiting for any I/O, the real time is smaller than the system time, as shown:
root@chaitu:/home/chaitu/Desktop/Chk# time ./new
real    0m0.001s
user    0m0.000s
sys     0m0.004s
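A small hedged demonstration of point 1: when work runs on more than one core, the CPU time summed across processes can exceed the wall-clock time (the dictionary file is just a convenient input; at the sub-millisecond scale shown above, accounting granularity alone can also skew the numbers):
Code:
time ( gzip -c /usr/share/dict/words > /dev/null &
       gzip -c /usr/share/dict/words > /dev/null &
       wait )
# on a multi-core machine, user+sys here is roughly double the real time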
View 4 Replies
Jul 30, 2011
I have a VPS server and I installed an X server on it, and all is OK. I created a new user for my client, but I need to limit his access as follows:
he can download and upload to his home folder ("browse with Firefox");
he can't install or use any applications (just the ones I installed);
he can't see or browse the file system - and if I can give him a specific amount of space on the hard disk, that would be better;
he can extract and compress files;
he can't edit the settings.
I also have sensitive folders and settings that I don't want him to see, so how do I limit his access?
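For the "specific space on the hard disk" part, a hedged sketch with traditional disk quotas (the filesystem must be mounted with the usrquota option and the user name is a placeholder):
Code:
setquota -u client1 0 5000000 0 0 /home     # ~5 GB hard block limit, counted in 1 KB blocks
quota -u client1                            # show what the user is now allowed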
View 8 Replies
Jul 26, 2010
I am testing my FTP server configuration. Anonymous download works, however anonymous upload does not. I am getting the following error message from both Windows and Linux 5.4 clients: "553 cannot create the file." And I am running Fedora 12.
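Assuming vsftpd (the post does not say which FTP daemon), a hedged checklist for 553 on anonymous upload; the upload path and the exact SELinux boolean spelling may differ by release:
Code:
# /etc/vsftpd/vsftpd.conf
write_enable=YES
anon_upload_enable=YES

# the upload directory must be writable by the ftp user but not owned by it:
#   mkdir /var/ftp/upload && chgrp ftp /var/ftp/upload && chmod 730 /var/ftp/upload
# on Fedora, SELinux also has to allow anonymous writes:
#   setsebool -P allow_ftpd_anon_write=1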
View 3 Replies
Oct 14, 2009
Is there a way to save/create a file with fopen so that the file is in the user's home directory? Normally I'd do fopen("/home/me/myfile...., but "me" might change from one user to another. So can I do some sort of switch so that it saves to whoever is using the program?
View 5 Replies