Ubuntu :: Should Mediatomb Generate 3 Separate Huge Log Files With 5GB Of Data Each

Jul 4, 2011

I streamed video through my computer with MediaTomb yesterday. The problem is that now I've got these huge log files, and I am running out of disk space (less than 1 GB left) as we speak. They're filled with ufw entries, but my question is:

I read somewhere about a program called logrotate that is supposed to keep logs from getting too big. Is that wrong, or should MediaTomb really generate three separate log files with 5 GB of data each for just two hours of streaming?
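No, that volume is not normal; the flood is the ufw entries, not MediaTomb itself. logrotate is the right tool to cap log growth. A minimal sketch of a size-based rule, assuming the log lives at /var/log/mediatomb.log (check where your package actually writes it), verified with a dry run:

```shell
# Write a sketch logrotate rule to a temp file (path is an assumption):
cat > /tmp/mediatomb-rotate.conf <<'EOF'
/var/log/mediatomb.log {
    size 100M
    rotate 3
    compress
    missingok
    notifempty
}
EOF
# -d is a dry run: it parses the rule and reports what would happen,
# without touching any files.
command -v logrotate >/dev/null && logrotate -d /tmp/mediatomb-rotate.conf || true
```

Also worth doing: reduce the flood at its source with `sudo ufw logging low` (or `off`), since ufw's per-packet logging is what fills the files.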

View 2 Replies



Ubuntu Multimedia :: Mediatomb And Subtitles / Getting Response "Fail" And Mediatomb Will Not Start?

Jan 20, 2010

I've looked at several posts all over the net but still can't get subtitles (.srt files) to bind with the stream in MediaTomb.

MediaTomb works fine without the subtitle transcoding profile, but when it is added and MediaTomb is restarted using sudo /etc/init.d/mediatomb restart, I get the response "Fail" and MediaTomb will not start. If I change <transcoding enabled="yes"> to "no", then MediaTomb works fine.

Below is my config.xml

Quote:

<?xml version="1.0" encoding="UTF-8"?>
<config version="1" xmlns="url . url
xsi:schemaLocation="url 1 url
<!--
Read /usr/share/doc/mediatomb-common/README.gz section 6 for more
code....
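One debugging step worth trying: the init script reports only "Fail", but running MediaTomb in the foreground prints the actual config parse error, which usually points at the malformed line in the transcoding section (config path per the Ubuntu package; shown commented since it starts a server):

```
# mediatomb -c /etc/mediatomb/config.xml
```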

View 1 Replies View Related

Ubuntu Multimedia :: MediaTomb Not Playing .mp3 Files?

Feb 23, 2010

I know this has been talked about a lot, but I'm not quite understanding how to handle it. I have MediaTomb running; I turn my PS3 on, search for the server, and pick it up. I can play all of my .avi files fine, but when I go into Audio > All Audio it just says there are no titles. How do I get the mp3 files to show up so I can play them? And also, why does MediaTomb not work when the terminal is not open? I open a terminal, type mediatomb, and when I close it, the PS3 says disconnected from media server.
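On the second question: MediaTomb dies with the terminal because it runs in the foreground and receives SIGHUP on logout. The packaged init script avoids this (`sudo /etc/init.d/mediatomb start`, as used elsewhere in these threads). For any foreground command, `nohup` plus `&` gives the same detachment; a self-contained demo with a stand-in command:

```shell
# nohup ignores SIGHUP and the trailing & puts the job in the background,
# so it survives the terminal closing. Demonstrated with sleep as a stand-in:
nohup sleep 2 > /tmp/nohup-demo.out 2>&1 &
# The job keeps running even after this shell exits.
```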

View 2 Replies View Related

Ubuntu Multimedia :: MediaTomb Can See Folders But Not Files?

Mar 14, 2010

I have been using MediaTomb for a long time now to stream HD movies to the PS3, and it has worked perfectly... until now. I use MKV2VOB so the PS3 can play the video and MediaTomb has no transcoding to do. But now, when I add the video to the MediaTomb database, the PS3 finds the server and can browse through all the folders, but when I arrive at the place where the movie should be, it says there are no videos. I have been looking around trying to see what the problem could be. I have checked that <protocolInfo extend="yes"/> is added to the server config file, and that's about all I can find on the problem. The thing I don't understand is that I have changed nothing, and the server must be running since I can browse the folders.

I am running Ubuntu 9.10 and mediatomb v0.12

View 1 Replies View Related

Ubuntu :: Huge /var/log Files

Jul 7, 2011

So I noticed today that my machine's root hard drive had almost no space left on it.

I ran a disk usage analyzer and found out my /var/log folder is 95 GB.

The large logs are:

Can I just delete them? Also, how can I stop this from happening again?
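Rotated copies (`*.1`, `*.gz`) are safe to delete. For the log a daemon is still writing to, truncating in place is safer than deleting, because the daemon keeps its open file handle either way; a demo on a throwaway file:

```shell
# Create a stand-in 1 MiB "log" file:
head -c 1048576 /dev/zero > /tmp/fake.log
# Empty it in place without removing it (the writer's handle stays valid):
: > /tmp/fake.log
stat -c %s /tmp/fake.log    # size is now 0
```

To stop the recurrence, check which service is flooding the log (e.g. `tail /var/log/syslog`) and fix that, then let logrotate cap sizes.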

View 1 Replies View Related

Ubuntu :: Delete HUGE Log Files?

Nov 19, 2010

How can I delete my log files? They are 131.2 GB! And I need space on my PC. Is it OK to delete them?

View 6 Replies View Related

Ubuntu :: HUGE Syslog And Daemon.log Files?

Jul 25, 2010

I have a 60GB partition with / and /home on it. I logged on yesterday and it gave me a warning saying that I had only 1.9 GB of disk space left. I ignored it for a day and assumed that I had too many videos and pics. But the next day, without having added any files or downloaded any software, I had 0 B left. I used the disk usage analyser and found that 33 GB came from /var/log. It was from two log files, syslog and daemon.log, at 16.5 GB each! I opened them up and found that this line of text was repeated hundreds of thousands of times.


Code:
Jul 22 19:32:36 aulenback-desktop ntfs-3g[5315]: Failed to decompress file: Value too large for defined data type

[code]...
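The real fix is repairing or remounting the NTFS volume that ntfs-3g is choking on, but until then a syslog filter can stop the repeated message from filling the disk. A sketch for rsyslog (Ubuntu's default syslog daemon of that era); the file name is an assumption:

```
# /etc/rsyslog.d/30-drop-ntfs-noise.conf  (sketch)
# Discard any message containing the repeated ntfs-3g error:
:msg, contains, "Failed to decompress file" stop
# Note: older rsyslog versions use "~" instead of "stop" to discard.
```

Restart rsyslog after adding the file, then truncate the two giant logs to reclaim the space.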

View 3 Replies View Related

Ubuntu :: Moving A Huge Amount Of Files?

Jul 27, 2011

I have about 2 TB of 700 MB .avi files as data on discs and want to spread them across two 2 TB external USB drives (3.5" SATA inside the housing). Obviously I have to rip them to the laptop and then move them to the external HDDs (what a laborious little task). Am I better off doing the ripping in Meerkat or on a Windows machine? The files need to be accessible from W7, XP, and Meerkat via VLC player. What should I format the drives to?

View 5 Replies View Related

Ubuntu :: Md5sum - Fast Way To Verify Huge Files

Oct 29, 2010

I'm looking for a fast way to verify a copy of a folder with 150 GB of data in 33 files. Some of the files are a few KB, while a few are 20-30 GB. I've done a file count, which is quick, but doesn't verify that all the files are intact. I tried running md5sum on them, which works, but will probably take as long as copying the files in the first place. diff works too, but is also slow.
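Any byte-exact verification has to read both copies once, but `cmp` skips the hashing overhead and, crucially, exits at the first differing byte, so corrupt copies fail fast; a size check first is effectively free. A minimal sketch of the per-file check:

```shell
# Two small stand-in files for the demo:
printf 'same data' > /tmp/a.bin
printf 'same data' > /tmp/b.bin
# Cheap size check first, then byte comparison that stops at the
# first mismatch:
if [ "$(stat -c %s /tmp/a.bin)" = "$(stat -c %s /tmp/b.bin)" ] \
   && cmp -s /tmp/a.bin /tmp/b.bin; then
    echo identical > /tmp/verify.out
fi
```

If you will verify the same copy more than once, `md5sum` is still worth the one-time cost: save the sums, then `md5sum -c sums.txt` re-verifies later without touching the source.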

View 1 Replies View Related

Ubuntu Multimedia :: Opening Huge Wave Files?

Jun 24, 2011

I have a wav file bigger than 8GB. I recorded it on a Windows PC. Unfortunately, wav files can't be bigger than 2GB, yet somehow I got a file that is almost 9GB. I tried to chop the file under Ubuntu into smaller pieces to open it part by part. I used GNOME Split to divide the file and made 10 parts out of it. Now I have these parts of the data, which I can't read with any program except GNOME Split, which would only merge them back together again and bring me to the beginning of my problem. So my question is: is there any other way to open or split-and-open a wav file of that size, or maybe a way to open the split file partially?
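The root cause: a RIFF/WAV header stores sizes in 32-bit little-endian fields, so ~4 GiB is the format's hard ceiling (some software caps at 2 GiB, treating the field as signed). A 9 GB file therefore carries a bogus size field, which is why players refuse it. The field can be inspected directly:

```shell
# Fake minimal RIFF header: "RIFF" + 32-bit size + "WAVE"
# (bytes \044 \010 \000 \000 are 0x24 0x08 0x00 0x00, little-endian):
printf 'RIFF\044\010\000\000WAVE' > /tmp/tiny.wav
# Read the 4-byte size field at offset 4 as an unsigned 32-bit integer:
od -A d -t u4 -j 4 -N 4 /tmp/tiny.wav   # field reads 2084 (0x0824)
```

Splitting with a generic byte splitter like GNOME Split produces pieces with no headers at all, which is why nothing opens them. A tool that rewrites valid headers per chunk is needed; ffmpeg's segment muxer can do this without re-encoding (command assumed from ffmpeg's documentation, not tested on a file this size): `ffmpeg -i big.wav -f segment -segment_time 1800 -c copy part%03d.wav`.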

View 3 Replies View Related

Debian :: 8.0 Jessie - Var Log Files Getting Huge

Jan 29, 2016

I installed Debian on my laptop and now I got a warning that my log files inside /var are getting out of hand... the following files:

36G daemon.log
48G daemon.log.1
41G kern.log
55G kern.log.1
31G messages
42G messages.1
8.2G syslog
17G syslog.1

How can I clear these out and set things up properly so they don't take up so much space?
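Sizes like 55 GB in kern.log mean something is flooding the logs (check the tail of kern.log for the repeating message and fix that service first). After truncating the current files, a size cap via logrotate keeps them bounded; a sketch, meant to be merged into the distribution's existing rsyslog rotation rules rather than duplicated:

```
/var/log/syslog /var/log/daemon.log /var/log/kern.log /var/log/messages {
    size 200M
    rotate 2
    compress
    missingok
}
```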

View 6 Replies View Related

Slackware :: Huge.s No Header Files ?

Feb 12, 2011

I've started using the huge.s kernel, and when I try to compile packages Slackware complains about kernel headers, but all I see are the SMP header files on the Slackware discs.

View 2 Replies View Related

Ubuntu :: Determined That A Huge Amount Of Disk Space Is Being Taken Up By Files In /var/log/?

May 18, 2010

I examined the problem and determined that a huge amount of disk space is being taken up by files in /var/log/. The following files:

Code:
/var/log/messages.1
/var/log/kern.log.1
/var/log/daemon.log.1
/var/log/messages
/var/log/kern.log
/var/log/daemon.log
/var/log/syslog

are all over 1 GB in size. The largest is 18 GB. Together, they total 48.3 GB. I restarted the system, forcing an fsck.

View 3 Replies View Related

Ubuntu :: General Software For Viewing Huge .txt Files (over 10 Megas)

May 19, 2010

Is there any software in Linux to view huge .txt files, say over 10 MB? I'm now using the default gedit, version 2.28.0, which doesn't seem able to open huge .txt files. It's the same with the default .txt viewer in Windows, though WinWord seems to work fine there. What software under Linux can browse huge .txt files?
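gedit loads the whole file into memory; terminal pagers stream it instead, so size barely matters. `less huge.txt` opens instantly at any size, and the stream tools can slice out a region without loading the rest; a self-contained demo:

```shell
# Generate a large-ish text file:
seq 1 200000 > /tmp/big.txt
# Print just three lines from the middle; sed streams, so only the
# bytes up to that point are read:
sed -n '100000,100002p' /tmp/big.txt
```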

View 5 Replies View Related

Ubuntu :: /var/log Files Are Huge; Mounting Large Hard Drives?

Oct 9, 2010

syslog, messages and kern.log are incredibly huge files that are taking up a lot of space on my hard drive. Is it safe to remove them and/or to reduce logging so it doesn't take up such an enormous amount of hard disk space? If so, how can I reduce the logging so it doesn't produce logs that are tens of GB in size? Also, mounting a drive places it into the folder /media. Will it become problematic if the size of the mounted drive exceeds the amount of free space available on my Ubuntu partition?

View 4 Replies View Related

Ubuntu :: 10.4 Making HUGE Log Files--running Out Of Disk Space?

Feb 2, 2011

I started getting errors about running out of disk space in root this morning. I hunted up what's taking all the space; /var/log is 39 GB (Ubuntu is installed on a 50 GB partition). It's specific files that live in that directory, not subfolders. The files are:

kern.log = 11.6 GB
messages (plain text file) = 11.4 GB
kern.log.1 = 6.1 GB

[code]...

View 9 Replies View Related

Ubuntu :: Libreoffice Writer Or Natty Produce Huge Files

May 3, 2011

I do monthly reports by copying the previous document, updating the text and changing the images. The images are the same size and number each month. Since I upgraded my laptop to Natty last month, my document suddenly went from 942 kB to 10.1 MB in .odt. When saving to PDF, the usual size of 472 kB went up to 1.9 MB. I have searched the net and the forums but haven't seen anything about a similar issue.

I'm not sure if it's an issue from the previous document being produced in OpenOffice and now updated and saved in LibreOffice, or if it's somehow something to do with the upgrade from Maverick to Natty. I would hope I don't have to uninstall LibreOffice and install OpenOffice as a solution (which I understand is not entirely easy in Natty; something I read about OpenOffice being transitional to Libre). I can't email customers simple documents that are over 10 MB large...

View 1 Replies View Related

Red Hat / Fedora :: Can't Find Any Huge Files In That Filesystem

Dec 22, 2010

I am facing a strange problem on my server. One of my filesystems shows as 3.1G when I execute the df -h command and its utilization shows as 83%, but when I cd to the directory /usr/local I cannot find any huge files in that filesystem, and I have searched for hidden files as well.

groupserver:~ # df -h
Filesystem Size Used Avail Use% Mounted on
/dev/sda9 3.1G 2.5G 532M 83% /usr/local
groupserver:/usr/local # du -sh *
0 bin
93M abinav

[Code]...

View 2 Replies View Related

Ubuntu Servers :: Deleted Log Files Taking Up Huge Disk Space?

Sep 7, 2010

My /var/ partition continues to fill up on all my servers, and it is because the logs in /var/log/apache2 or /var/log/mysql are being deleted during log rotate, but their file handles are being held open. Thus, a "du -sh /var/log" shows the correct values, but "df | grep /var" shows something much different.

It seems that the log files rotate, however if I run "lsof | grep deleted" it returns lots of files that are no longer visible in the directory, however refuse to clear themselves off the disk.

The only way I have found to make these log files go away (and thus clear up the disk space on the partition I should have) is to restart either apache or mysql, depending on which process has huge sized log files being held open.

Is it just me, or is this a big flaw in the way Linux works, that it can't figure out how to release the file handle for a log so the disk space can be reclaimed? This is happening to me a lot lately.

Here is some output from one of my web servers so you can see what I am seeing...

root@web49:~# df -h | grep /var$
Filesystem Size Used Avail Use% Mounted on
/dev/sda8 9.2G 6.1G 2.7G 70% /var
root@web49:~# du -sh /var

[Code]....
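This is actually deliberate Unix semantics rather than a flaw: unlinking removes the name, but the blocks stay allocated until the last open handle closes. `du` walks the directory tree (no name, no count); `df` asks the filesystem (blocks still claimed), which is exactly the mismatch shown above. The effect can be reproduced in a few lines:

```shell
# Open a file on descriptor 3 and write to it:
exec 3> /tmp/ghost
echo "log data" >&3
# Unlink it: du no longer sees it, but the blocks are still allocated:
rm /tmp/ghost
# The process's fd table still points at the deleted file:
ls -l /proc/$$/fd/3 > /tmp/fd3.txt
cat /tmp/fd3.txt            # shows "... -> /tmp/ghost (deleted)"
# Closing the descriptor is what actually frees the space:
exec 3>&-
```

For log rotation specifically, the standard fixes are a `postrotate` reload of the daemon in the logrotate rule, or logrotate's `copytruncate` option, which copies then empties the file in place so the daemon's handle never goes stale.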

View 9 Replies View Related

Ubuntu :: Generate Compressed Files In Zip Format ?

Oct 30, 2010

I was trying to figure out how to generate compressed files in zip format and searched on here. The search produced a list of forum entries on the topic, but all of the instructions were on how to do it in terminal, how to download obscure programs and install them from terminal, then run them from terminal, with all these arcane sets of switches and parameters. Eesh.

It comes with Ubuntu, after all. In the case of zipping files, all you have to do is to go to the File Manager, find the file(s), select it or them, right-click on it or them, and select Compress and file type zip. It's so simple.
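For completeness, the terminal side of this is also a one-liner: `zip -r archive.zip somefolder/` (the zip package may need installing). tar and gzip ship by default and do the same job:

```shell
# Make a throwaway folder to compress:
mkdir -p /tmp/demo && echo hello > /tmp/demo/a.txt
# Create a compressed archive of it (-C sets the working directory):
tar -czf /tmp/demo.tar.gz -C /tmp demo
# List the archive's contents to verify:
tar -tzf /tmp/demo.tar.gz
```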

There have been a number of other tasks where I wind up spending hours figuring out how to implement the advice offered in these forums through Terminal. The folks who offer the advice often are so good at it that they leave out steps obvious to them, but that take a lot of work for somebody not as skilled at it to find out. After crawling through broken glass to get the job done, and normally screwing something up so it's not quite right once I get it going, I figure out how to do it through the GUI and find out it takes a fraction of the effort.

View 9 Replies View Related

Ubuntu :: Generate A List With Files And Folders?

Feb 2, 2011

Is there an application available to generate a list of the files and folders in a location, like a hard drive or a folder? The list could be in any format; even a text file would be just fine.
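No extra application needed: `find` prints every file and directory under a starting point, and redirecting that gives exactly the requested text-file listing. A demo on a throwaway tree:

```shell
# Build a small directory tree to list:
mkdir -p /tmp/tree/sub
touch /tmp/tree/sub/file.txt
# Recursively list everything under it into a text file:
find /tmp/tree > /tmp/listing.txt
cat /tmp/listing.txt
```

`ls -R` gives a similar recursive listing, and the `tree` package prints an indented version if you prefer that layout.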

View 3 Replies View Related

Slackware :: Create The Huge.s Kernel Files On The Disks?

Apr 12, 2010

How do I create the huge.s kernel files on the Slackware disks? Or at least direct me to a post if there is the same question. I currently rsync my files with Alien BOB's script, and I use syslinux to install from my USB stick. I was wanting to install using a later kernel just for testing purposes (i.e. 2.6.34-rc3 as of this writing).

View 9 Replies View Related

Hardware :: Data Storage Locations Of The Separate Devices Involved In Booting Requested

Oct 18, 2010

I'm trying to get a complete overview of booting so I can multiboot. An explanation of the hardware that stores data and the hardware that runs it with the paths the data takes would be awesome!

Here are some quotes that are not comprehensive.

Quote from [url] "When the processor first starts up, it is suffering from amnesia; there is nothing at all in the memory to execute. Of course processor makers know this will happen, so they pre-program the processor to always look at the same place in the system BIOS ROM for the start of the BIOS boot program. This is normally location FFFF0h, right at the end of the system memory. They put it there so that the size of the ROM can be changed without creating compatibility problems. Since there are only 16 bytes left from there to the end of conventional memory, this location just contains a "jump" instruction telling the processor where to go to find the real BIOS startup program."

System memory is your RAM, is it not? Why are they being specific in stating the address location in the firmware that the BIOS uses? An external EEPROM on the board is totally different from RAM, is it not? Does the BIOS data travel to a specific RAM location?

Is there a small processor connected to BIOS or is everything run with the Main CPU?

What exactly is the "chipset" that is referred to with booting?

View 2 Replies View Related

Ubuntu Security :: Encrypt Files Using The Keys - Generate ?

Sep 8, 2010

I recently upgraded to Ubuntu 10.04. I love the passwords and keys application, but was somewhat surprised at the lack of a context menu in gnome to encrypt a file.

In general, I cannot find how to encrypt files using the keys I generate. Maybe I'm missing something? Probably, I just thought since Ubuntu comes with OOB key generation it would have OOB encryption capabilities.

I've read about Seahorse and other ways to ADD encryption; I'm just wondering if Ubuntu does it natively. It'd be a good idea to add to Brainstorm: right-click and encrypt.
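Seahorse ("Passwords and Keys") is a front end over GnuPG, which Ubuntu does ship natively, so the keys it generates work from the command line: `gpg -e -r KEYID file` encrypts to one of your keys. A self-contained round trip using symmetric (passphrase) mode, which needs no pre-generated key:

```shell
echo secret > /tmp/plain.txt
# Encrypt with a passphrase (--pinentry-mode loopback is a GnuPG 2.1+
# flag for batch passphrases; older gpg versions omit it):
gpg --batch --yes --pinentry-mode loopback --passphrase demo \
    --symmetric -o /tmp/plain.gpg /tmp/plain.txt
# Decrypt it back:
gpg --batch --yes --pinentry-mode loopback --passphrase demo \
    -o /tmp/roundtrip.txt --decrypt /tmp/plain.gpg
cat /tmp/roundtrip.txt
```

The right-click menu item you're describing existed too: the seahorse-plugins package added an "Encrypt..." entry to Nautilus in that era (package name per Ubuntu's archives; verify it is available for your release).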

View 6 Replies View Related

Software :: Need To Transfer Huge Number Of Files - Good FTP Program?

Oct 26, 2009

I need to transfer a massive amount of data (2.5 terabytes, many files, directory structure) to an embedded RAID box which has a minimal Linux on it (some custom distro from Western Digital). We tried rsync (version 2.6.7), but it crashes because the file list is too big for the RAM available (fixed in later versions of rsync, but I don't know how to update; it's not Debian-based and there are no compiler tools). We tried NFS, but the max bandwidth produced is around 1 MB/sec (CPU-bound?), so it'd take around 3 weeks this way. Samba has problems with big files (and we have some 20 GB files in there).

SCP isn't installed, and would probably also be CPU-bound due to encryption, I think. So the only option left is FTP. We're currently trying ncftp with the command "put -R /path/to/data/", but it's been running for over an hour, eating up most of the RAM, and not using any bandwidth. I think it is still building a file list or something. FTP already worked for a single 20 GB file with acceptable bandwidth of about 12 MB/sec. Does anyone know a better FTP program (for the console) that can start transferring data right away, or at least display an estimated time for the copy preparation?
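An alternative worth considering: a streamed `tar` transfer has no per-file protocol chatter and, unlike old rsync and ncftp, builds no up-front file list, so it starts moving bytes immediately and uses almost no RAM. A local demo of the pipeline shape:

```shell
# Source and destination stand-ins:
mkdir -p /tmp/src /tmp/dst
echo data > /tmp/src/file1
# Stream the whole tree through a pipe; no archive file is ever created:
tar -C /tmp/src -cf - . | tar -C /tmp/dst -xf -
ls /tmp/dst
```

Across the network the same shape pairs with netcat, if the box has it: on the receiver `nc -l -p 9000 | tar -xf -`, on the sender `tar -cf - /path/to/data | nc boxaddr 9000` (flag spelling varies between nc variants, so check the box's `nc -h` first).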

View 8 Replies View Related

Programming :: Efficient Access Of Huge Files Or Defrag Ext4?

Sep 30, 2010

I need to figure out how to arrange for the fastest-possible read-access of a large or huge memory-mapped file. I'm writing high-speed real-time object-chasing software for a NASA telescope (on earth). This software must detect images of fast moving objects (across arbitrary fields of fixed stars), estimate what direction and speed the object image is traveling (based on the length and direction of a streak on the detection image), then chase after the object while capturing new 4Kx4K pixel images every 2~5 seconds, quickly matching its speed and trajectory, then continue to track and capture images until the object vanishes (below horizon, into earth shadow, etc).

I have created two star "catalogs". Both contain the same 1+ billion stars (and other objects), but one is a "master catalog" that contains all known information about each object (128 bytes per object == 143GB) while the other is a "nightly build" that only contains the information necessary to perform the real-time process (32 bytes per object == 36GB) with object positions precisely updated for precession and proper-motion each night. Almost always the information in the "nightly build" catalog will be sufficient for the high-speed (real-time) processes.

[Code]...

View 8 Replies View Related

General :: How To Generate Report - Sa* And Sar* Files ?

May 29, 2011

I have collected the sa* and sar* files for the past two weeks, and I need to generate reports from these files. How can I do so? I am using CentOS 5.5. Assist me with a tool or a command to do the same.
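sysstat's own `sar` reads its binary sa* files back with `-f`, so looping over the collected files produces one text report per day. A sketch assuming the default /var/log/sa layout (some systems use /var/log/sysstat instead):

```shell
mkdir -p /tmp/sar-report
for f in /var/log/sa/sa??; do
    [ -e "$f" ] || continue              # skip if the glob matched nothing
    command -v sar >/dev/null || break   # sysstat not installed here
    # -A dumps all collected statistics for that day:
    sar -A -f "$f" > "/tmp/sar-report/$(basename "$f").txt"
done
```

For machine-readable output, `sadf -d /var/log/sa/saNN` exports the same data as delimited text, which is easy to graph in a spreadsheet.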

View 6 Replies View Related

Ubuntu Networking :: How To Generate Default Samba Configuration Files

Sep 1, 2011

I had some trouble with Samba, so I re-installed it. After I uninstalled Samba, I noticed the old /etc/samba folder and files were left, so I deleted all of them. Then I installed Samba again; however, no /etc/samba files were installed. How can I generate the default Samba configuration files?
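This is dpkg conffile behaviour: a remove (as opposed to a purge) deliberately keeps config files, and once you delete them by hand, a plain reinstall will not recreate them. Either ask dpkg to restore missing conffiles, or purge so the next install starts clean (commands shown commented, since they modify the system; package names assumed to be samba / samba-common):

```
# sudo apt-get install --reinstall -o Dpkg::Options::="--force-confmiss" samba samba-common
# -- or --
# sudo apt-get purge samba samba-common && sudo apt-get install samba
```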

View 4 Replies View Related

General :: Why Scanning 10 Pages Is Always Creating Huge >45Mb To 110Mb PDF Files

Jan 9, 2011

Why is my scanning always creating huge 50 MB to 100 MB PDF files? Each A4 .pnm file is 6.5 MB at a resolution of 150. If I decrease the resolution below 100, my text starts to become unreadable...
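The arithmetic points at the cause: .pnm is an uncompressed bitmap, so at ~6.5 MB per page, ten pages embedded as-is are ~65 MB of PDF, right in the observed range. The fix is to compress the page images rather than lower the resolution; a sketch using ImageMagick and Ghostscript (both commonly packaged; commands shown commented since they depend on your files):

```
# convert page*.pnm -compress jpeg -quality 60 scan.pdf
# gs -sDEVICE=pdfwrite -dPDFSETTINGS=/ebook -o smaller.pdf scan.pdf
```

For black-and-white text pages, scanning in lineart mode and saving as Group 4 TIFF before PDF conversion shrinks pages even further without losing legibility.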

View 11 Replies View Related

Software :: Automatically Generate Text Files ?

Sep 17, 2010

I have a personal server that is password protected. I convert my movies, and make it so that I can watch them through a browser when I want to. I have quite a few movies. They are sorted into different directories. In each directory there is a WebM folder that contains the video file.

Here is what I want to do:Scan all of the directories and subdirectories in /var/www/ for files with extension of webm.

When a file is found, go to the parent directory and create an html file with the same name as the video, only with an html extension instead of webm.

Automatically enter the following html code into the file with the file name matching the file that was found.

Code:

Is it possible to do this with a script? Is there a GUI program that can do this? I don't mind running the script every time after I convert a movie, but if it could monitor the folders that would be nice.
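Yes, this is a natural fit for a short script: `find` locates the .webm files, and a loop writes an html file next to each one's parent directory. A sketch with a placeholder html body, since the post's own snippet was not included; /tmp/www stands in for /var/www:

```shell
ROOT=/tmp/www
# Build a sample tree matching the described layout:
mkdir -p "$ROOT/movie1/WebM"
touch "$ROOT/movie1/WebM/movie1.webm"
# For each .webm, write <name>.html into the parent of its WebM folder:
find "$ROOT" -name '*.webm' | while read -r f; do
    name=$(basename "$f" .webm)
    dir=$(dirname "$(dirname "$f")")      # parent of the WebM folder
    cat > "$dir/$name.html" <<HTML
<html><body><video src="WebM/$name.webm" controls></video></body></html>
HTML
done
```

For the monitoring part, incron or inotifywait (from the inotify-tools package) can run the script automatically whenever a new file appears, instead of you invoking it after each conversion.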

View 11 Replies View Related







Copyrights 2005-15 www.BigResource.com, All rights reserved