Programming :: Efficient Access Of Huge Files Or Defrag Ext4?

Sep 30, 2010

I need to figure out how to arrange the fastest possible read access of a large or huge memory-mapped file. I'm writing high-speed real-time object-chasing software for a NASA telescope (on Earth). This software must detect images of fast-moving objects (across arbitrary fields of fixed stars), estimate what direction and speed the object image is traveling (based on the length and direction of a streak on the detection image), then chase after the object while capturing new 4Kx4K-pixel images every 2~5 seconds, quickly matching its speed and trajectory, and continue to track and capture images until the object vanishes (below the horizon, into Earth's shadow, etc.).

I have created two star "catalogs". Both contain the same 1+ billion stars (and other objects), but one is a "master catalog" that contains all known information about each object (128 bytes per object == 143 GB), while the other is a "nightly build" that contains only the information necessary to perform the real-time process (32 bytes per object == 36 GB), with object positions precisely updated for precession and proper motion each night. Almost always, the information in the "nightly build" catalog will be sufficient for the high-speed (real-time) processes.

[Code]...
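
The code is elided above, but two cheap checks bear directly on memory-mapped read speed: whether the catalog file is laid out contiguously on disk, and whether it is already resident in the page cache. A shell sketch (the catalog path is hypothetical, and the pre-read only pays off if RAM can hold the 36 GB nightly build):

Code:
# How many extents is the catalog scattered across?
filefrag /data/nightly_catalog.bin

# One sequential pass warms the page cache, so later memory-mapped
# reads hit RAM instead of seeking the disk:
dd if=/data/nightly_catalog.bin of=/dev/null bs=1M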

View 8 Replies


Fedora :: How To Defrag Ext4

Oct 27, 2009

fsck reports 18905 non-contiguous files (20.9%) and 40 non-contiguous directories (0.0%) on an ext4 file system, so I would really like to defragment it.
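
e4defrag, shipped with newer e2fsprogs, can both report and reduce fragmentation on a mounted ext4 filesystem; a minimal sketch, using /home as an example target:

Code:
# Score the current fragmentation without changing anything:
sudo e4defrag -c /home

# Defragment the mount point (a single file or directory also works):
sudo e4defrag /home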

View 7 Replies View Related

Software :: Best Way To Go About Performing Offline Defrag Of Ext4?

Aug 3, 2010

Can someone tell me the best way to go about performing an offline defrag of ext4? Considerations: it needs to be safe (i.e., I don't want to bork up my music files). I have seen references to e4defrag, but everything seems to point to it still being buggy. I have a large ext4 drive, two-thirds empty, that sees lots of adds/removes. I would like to be able to see the amount of fragmentation, and I would like to make files contiguous and then condensed to the start of the partition (that would make read times faster, right?).

View 4 Replies View Related

OpenSUSE Install :: Access Files From Windows From An Ext4 Disk?

Feb 5, 2010

I am dual-booting OpenSUSE and Windows 7 Pro x64. Each OS is installed on a separate 1 TB hard drive. One question that I have tried to Google for, with no success, is: how do you access ext4 from Windows? Shortly after I installed OpenSUSE, my OpenSUSE hard drive "vanished" from Windows 7. Naturally, I can access all my hard drives from OpenSUSE, which does support NTFS. I am quite sure that I am not the only person who has this problem, as I know that dual-booting Linux and Windows is quite common.

View 8 Replies View Related

Programming :: How To Make Pinging Function More Efficient

Mar 13, 2010

I am starting a project of my own (and learning C++ at the same time). I got my program to successfully scan a custom netmask, but it is REALLY slow. I want my program to do something similar to nmap -sP xxx.xxx.xxx.xxx-xxx. How can I speed it up, such as by pinging more than one IP at a time?

Code:
#include <cstdlib>
#include <iostream>
#include <sstream>
#include <string>

using namespace std;

int main() {
    string ip1, ip2, ip3, ip4;

    cout << endl << "Enter part 1: ";
    cin >> ip1;
    cout << endl << "Enter part 2: ";
    cin >> ip2;
    cout << endl << "Enter part 3: ";
    cin >> ip3;
    cout << endl << "Enter part 4: ";
    cin >> ip4;

    // Ping each host from the entered fourth octet up to 254.  The
    // original loop concatenated the octets without dots, so the
    // command passed to system() never contained a valid IP address.
    for (int host = atoi(ip4.c_str()); host < 255; ++host) {
        ostringstream cmd;
        cmd << "ping -c 1 " << ip1 << '.' << ip2 << '.' << ip3 << '.' << host;
        system(cmd.str().c_str());
    }
    return 0;
}
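
The loop above still waits for each ping to finish before starting the next one. Running them in parallel is easiest from the shell, where each ping can be backgrounded; a rough sketch, assuming the iputils ping (-W 1 is a one-second reply timeout) and 192.168.1.0/24 as an example subnet:

Code:
#!/bin/sh
# Launch all 254 pings at once; report each host that answers.
for host in $(seq 1 254); do
    ( ping -c 1 -W 1 "192.168.1.$host" > /dev/null 2>&1 \
        && echo "192.168.1.$host is up" ) &
done
wait    # block until every background ping has finished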

View 2 Replies View Related

Programming :: Experience To Make A Script More Efficient?

Mar 30, 2010

I wrote this script to attach URLs to specified six-digit numbers in a configuration text file. My original goal was to be able to pull the URLs and the six-digit numbers from .csv files; that would allow me to make the script more versatile, not only for this particular project, but also for other projects involving the configuration file. This script works, and has served its purpose, but it is not very pretty, and it's probably not very efficient. What can I do to improve it and possibly make it more versatile? I've thought about functions and arrays, but my skill set is still pretty limited. I'm not looking for someone to write it for me, just to point me in the right direction.

[Code]...
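
The script itself is elided above, but the CSV-driven version described can be sketched with awk; the file names map.csv (holding number,url pairs) and config.txt are hypothetical:

Code:
#!/bin/bash
# Pass 1 loads map.csv into an array keyed by the six-digit number;
# pass 2 appends the matching URL to every config line containing it.
awk -F, 'NR == FNR { url[$1] = $2; next }
         { for (n in url) if (index($0, n)) $0 = $0 " " url[n]; print }' \
    map.csv config.txt > config.new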

View 7 Replies View Related

Programming :: Efficient Way To Display JPEG Images In Browsers?

May 12, 2010

What is an efficient way to display JPEG images in browsers using C programming? Any links or sample code?
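
Whatever language generates the response, a browser will render a JPEG as long as the server sends a Content-Type: image/jpeg header followed by the raw bytes; a C program would write exactly the same stream to stdout. A hypothetical CGI sketch of the idea (the image path is an example):

Code:
#!/bin/sh
# CGI response: header, blank line, then the raw image bytes.
printf 'Content-Type: image/jpeg\r\n\r\n'
cat /var/www/images/photo.jpg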

View 6 Replies View Related

Ubuntu / Apple :: Access Files From External Ext4 Hard Drive Partition?

May 2, 2010

Recently my laptop broke down and wont start up. I'm currently trying to recover my files to my mac with an IDE to USB cable. It recognized my windows partition fine and I was able to get all my files off of that, but the majority of my stuff is on the ext4 partition that I have on it. Does anyone know how to access the ubuntu partition of this hard drive from my mac?

View 2 Replies View Related

Programming :: HUGE Files - Compare A List Of Patterns From One File And Grep Them Against Another File And Print Out Only The Unique Patterns?

Aug 13, 2010

I am trying to compare a list of patterns from one file and grep them against another file and print out only the unique patterns. Unfortunately, these files are so large that the job has yet to run to completion. Here's the command that I used:

Code:
grep -L -f file_one.txt file_two.txt > output.output

Here's some example data:

Code:
>FQ4HLCS01BMR4N
>FQ4HLCS01BZNV6
>FQ4HLCS01B40PB
>FQ4HLCS01BT43K
>FQ4HLCS01CB736
>FQ4HLCS01BU3UM
>FQ4HLCS01BBIFQ

How can I increase the efficiency, or is there another command I should use?
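
Two changes usually make this tractable, sketched below: the patterns are fixed strings, so -F avoids regex processing, and when both files hold one entry per line, sorting plus comm sidesteps grep's huge in-memory pattern list (output names are examples):

Code:
# -F treats patterns as fixed strings; -v -f prints the lines of
# file_two.txt that match nothing in file_one.txt.
grep -F -v -f file_one.txt file_two.txt > unique.txt

# Cheaper on RAM for huge lists: sort both files, then keep the lines
# that appear only in file_two.txt (comm -13 drops columns 1 and 3).
sort file_one.txt > one.sorted
sort file_two.txt > two.sorted
comm -13 one.sorted two.sorted > unique.txt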

View 14 Replies View Related

Ubuntu :: Huge /var/log Files

Jul 7, 2011

So I noticed today that my machine's root hard drive had almost no space left on it.

I ran the disk usage analyzer and found out my /var/log folder is 95 GB.

The large logs are:

Can I just delete them? Also, how can I stop this from happening again?
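
A live log shouldn't simply be rm'd while a daemon holds it open; truncating empties it without breaking the writer's file handle, and logrotate can cap future growth. A sketch (syslog is just an example; the lasting fix is finding whatever is flooding the logs):

Code:
# Empty a live log in place without disturbing the open handle:
sudo truncate -s 0 /var/log/syslog

# logrotate already manages /var/log; tightening the size and rotation
# limits in the matching file under /etc/logrotate.d/ caps regrowth.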

View 1 Replies View Related

Debian :: 8.0 Jessie - Var Log Files Getting Huge

Jan 29, 2016

I installed Debian on my laptop, and now I get warnings that the log files inside /var are getting out of hand... the following files:

36G daemon.log
48G daemon.log.1
41G kern.log
55G kern.log.1
31G messages
42G messages.1
8.2G syslog
17G syslog.1

How can I clear these and set things up properly so they don't take up so much space?

View 6 Replies View Related

Ubuntu :: Delete HUGE Log Files?

Nov 19, 2010

How can I delete my log files? They are 131.2 GB! I need the space on my PC. Also, is it OK to delete them?

View 6 Replies View Related

Slackware :: Huge.s No Header Files ?

Feb 12, 2011

I've started using the huge.s kernel, and when I try to compile packages Slackware complains about kernel headers, but all I see on the Slackware discs are the SMP header files.

View 2 Replies View Related

Ubuntu :: HUGE Syslog And Daemon.log Files?

Jul 25, 2010

I have a 60 GB partition with / and /home on it. I logged on yesterday and got a warning saying that I had only 1.9 GB of disk space left. I ignored it for a day and assumed that I had too many videos and pics. But the next day, I had not added any files or downloaded any software, yet I had 0 B left. I used the disk usage analyzer and found that 33 GB came from /var/log. It was from two log files: syslog and daemon.log, 16.5 GB each!! I opened them up and found that this line of text was repeated hundreds of thousands of times.


Code:
Jul 22 19:32:36 aulenback-desktop ntfs-3g[5315]: Failed to decompress file: Value too large for defined data type

[code]...

View 3 Replies View Related

Ubuntu :: Moving A Huge Amount Of Files?

Jul 27, 2011

I have about 2 TB of 700 MB AVI files as data on discs and want to spread it across two 2 TB external USB drives (3.5" SATA inside the housing). Obviously, I have to rip them to the laptop and then move them to the external HDDs (omg, laborious little task). Am I better off doing the ripping in Meerkat or on a Windows machine? The files need to be accessible from W7, XP, and Meerkat via VLC player. What should I format the discs to?

View 5 Replies View Related

Red Hat / Fedora :: Can't Find Any Huge Files In That Filesystem

Dec 22, 2010

I am facing a strange problem on my server. One of my filesystems shows as 3.1 GB when I execute the df -h command and its utilization shows as 83%, but when I cd to the directory /usr/local I cannot find any huge files in that filesystem, and I have searched for hidden files as well:

groupserver:~ # df -h
Filesystem Size Used Avail Use% Mounted on
/dev/sda9 3.1G 2.5G 532M 83% /usr/local
groupserver:/usr/local # du -sh *
0 bin
93M abinav

[Code]...
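
Two common causes fit these symptoms, sketched below: a file that was deleted while some process still holds it open (df counts its blocks, but du can no longer see it), or data shadowed underneath the mount point itself:

Code:
# List open files whose link count is zero: deleted but still held open.
lsof +L1

# Compare the two views; a large unexplained gap points at deleted-but-
# open files or at data hidden beneath the /usr/local mount:
du -shx /usr/local
df -h /usr/local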

View 2 Replies View Related

Programming :: How To Extract A Subset From A Huge Dataset

Mar 13, 2010

I have a huge file which is 450 GB. Its format is as below:

x1 50020 A 1
x1 50021 B 8
x1 50022 C 9

[code]....

Now, I want to extract a subset from this file. In this subset, column 1 is x10 and column 2 is from 600000 to 30000000. I wrote the following Perl script, but it doesn't work:

#!/usr/bin/perl
$file1 = $ARGV[0]; # Input file
$file2 = $ARGV[1]; # Output file

[code]...

I guess the input file and output file are both so big that my script can't handle them.
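
The script is elided above, but at this scale the usual culprit is slurping the whole file into memory; streaming one line at a time avoids that entirely. An awk sketch of the same filter, assuming the columns shown above are whitespace-separated:

Code:
# awk holds only one line in memory at a time, so file size is no issue:
awk '$1 == "x10" && $2 >= 600000 && $2 <= 30000000' huge.txt > subset.txt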

View 11 Replies View Related

Ubuntu :: Md5sum - Fast Way To Verify Huge Files

Oct 29, 2010

I'm looking for a fast way to verify a copy of a folder with 150 GB of data in 33 files. Some of the files are a few KB, while a few are 20-30 GB. I've done a file count, which is quick but doesn't verify that all the files are intact. I tried running md5sum on them, which works but will probably take as long as copying the files in the first place. diff works too, but is also slow.
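
A name-and-size comparison is nearly instant and catches missing or truncated files; only a byte-level pass (md5sum, cmp) can catch silent corruption, and any such pass must read all 150 GB. A sketch, with /original and /copy as example paths:

Code:
# Fast structural check: do both trees have the same names and sizes?
(cd /original && find . -type f -printf '%s %p\n' | sort) > orig.list
(cd /copy     && find . -type f -printf '%s %p\n' | sort) > copy.list
diff orig.list copy.list && echo "names and sizes match"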

View 1 Replies View Related

Ubuntu Multimedia :: Opening Huge Wave Files?

Jun 24, 2011

I have a WAV file bigger than 8 GB; I recorded it on a Windows PC. Unfortunately, WAV files can't be bigger than 2 GB, yet somehow I got a file that is almost 9 GB. I tried to chop the file under Ubuntu into smaller pieces so I could open it part by part. I used GNOME Split to divide the file and made 10 parts out of it. Now I have these parts of the data, which no program can read except GNOME Split merging them back together again, which would only bring me back to the beginning of my problem. So my question is: is there any other way to open, or split and open, a WAV file of that size, or maybe a way to open the split file partially?
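
GNOME Split cuts on byte boundaries, so only the first piece carries a WAV header, which is why nothing else can read the parts. If sox can read the oversized original at all (its header is already out of spec), it can split on time boundaries instead, writing each piece with a valid header; a sketch producing hour-long, sequentially numbered files:

Code:
# Cut big.wav into piece001.wav, piece002.wav, ... of one hour each:
sox big.wav piece.wav trim 0 3600 : newfile : restart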

View 3 Replies View Related

Slackware :: Create The Huge.s Kernel Files On The Disks?

Apr 12, 2010

How do I create the huge.s kernel files on the Slackware discs? (Or at least direct me to a post if the same question exists.) I currently rsync my files with Alien BOB's script, and I use syslinux to install from my USB stick. I was wanting to install using a later kernel just for testing purposes (i.e., 2.6.34-rc3 as of this writing).

View 9 Replies View Related

Ubuntu :: Determined That A Huge Amount Of Disk Space Is Being Taken Up By Files In /var/log/?

May 18, 2010

I examined the problem and determined that a huge amount of disk space is being taken up by files in /var/log/. The following files:

Code:
/var/log/messages.1
/var/log/kern.log.1
/var/log/daemon.log.1
/var/log/messages
/var/log/kern.log
/var/log/daemon.log
/var/log/syslog

are all over 1 GB in size. The largest is 18 GB. Together, they total 48.3 GB. I restarted the system, forcing an fsck.

View 3 Replies View Related

Ubuntu :: General Software For Viewing Huge .txt Files (over 10 Megas)

May 19, 2010

Is there any software in Linux to view huge .txt files, say, over 10 MB? I'm now using the default gedit, version 2.28.0, which seems unable to open huge .txt files. It's the same with the Windows default .txt viewer, although in Windows, WinWord seems to work fine. What software under Linux can browse huge .txt files?
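
Pagers stream a file on demand rather than loading it whole, so file size hardly matters; splitting is another option when a real editor is needed. A sketch:

Code:
# less reads the file as you scroll instead of loading it into memory:
less huge.txt

# Or cut it into 100,000-line pieces that gedit can cope with:
split -l 100000 huge.txt part_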

View 5 Replies View Related

Ubuntu :: /var/log Files Are Huge; Mounting Large Hard Drives?

Oct 9, 2010

syslog, messages, and kern.log are incredibly huge files that are taking up a lot of space on my hard drive. Is it safe to remove them and/or to reduce logging so it doesn't take such an enormous amount of hard disk space? If so, how can I reduce the logging so it doesn't produce logs that are tens of GB in size? Also, mounting a drive places it into the folder /media. Will it become problematic if the size of the mounted drive exceeds the amount of free space available on my Ubuntu partition?

View 4 Replies View Related

Ubuntu :: 10.4 Making HUGE Log Files--running Out Of Disk Space?

Feb 2, 2011

I started getting errors about running out of disk space in root this morning. I hunted down what's taking all the space; /var/log is 39 GB (Ubuntu is installed on a 50 GB partition). It's specific files that live in that directory, not subfolders. The files are:

kern.log = 11.6 GB
messages (plain text file) = 11.4 GB
kern.log.1 = 6.1 GB

[code]...

View 9 Replies View Related

Ubuntu :: Libreoffice Writer Or Natty Produce Huge Files

May 3, 2011

I do monthly reports by copying the previous document, updating the text, and changing the images. The images are the same size and number each month. Since last month, when I upgraded my laptop to Natty, my document has suddenly gone from 942 kB to 10.1 MB in .odt. When saving to PDF, the usual size of 472 kB went up to 1.9 MB. I have searched the net and the forums but haven't seen anything about a similar issue.

I'm not sure if it's an issue with the previous document having been produced in OpenOffice and now being updated and saved in LibreOffice, or if it's somehow to do with the upgrade from Maverick to Natty. I would hope I don't have to uninstall LibreOffice and install OpenOffice as a solution (which I understand is not entirely easy in Natty; I read something about OpenOffice being transitional to LibreOffice). I can't email customers simple documents that are over 10 MB large...

View 1 Replies View Related

Ubuntu :: Should Mediatomb Generate 3 Separate Of Huge Log Files With 5gb Of Data

Jul 4, 2011

I streamed video through my computer with MediaTomb yesterday. The problem is that now I have these huge log files. I am running out of disk space (less than 1 GB left) as we speak. They're filled with ufw entries, but my question is:

I read somewhere about a program called logrotate that is supposed to keep logs from getting too big. Is this wrong, and should MediaTomb really generate three separate log files with 5 GB of data each for just 2 hours of streaming?

View 2 Replies View Related

Software :: Need To Transfer Huge Number Of Files - Good FTP Program?

Oct 26, 2009

I need to transfer a massive amount of data (2.5 terabytes; many files; a directory structure) to an embedded RAID box which has a minimal Linux on it (some custom distro from Western Digital). We tried rsync (version 2.6.7), but it crashes because the file list is too big for the available RAM (fixed in later versions of rsync, but I don't know how to update; it's not Debian-based and there are no compiler tools). We tried NFS, but the maximum bandwidth produced is around 1 MB/s (CPU-bound?), so it'd take around 3 weeks this way. Samba has problems with big files (and we have some 20 GB files in there).

SCP isn't installed, and would probably also be CPU-bound due to encryption, I think. So the only option left would be FTP. We're currently trying ncftp with the command "put -R /path/to/data/", but it's been running for over an hour, eating up most of the RAM, and not using any bandwidth; I think it is still building a file list or something. FTP already worked for a single 20 GB file with acceptable bandwidth of about 12 MB/s. Does anyone know a better FTP program (for the console) that can start transferring data right away, or at least display an estimated time for the copy preparation?
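
If the box has netcat (even a busybox build), tar piped over a raw TCP socket avoids both encryption overhead and the up-front file-list scan, because tar streams the tree as it walks it. A sketch (host name and port are examples; some netcat builds take -l without -p):

Code:
# On the RAID box (receiver):
nc -l -p 9000 | tar -xf - -C /destination

# On the sender:
tar -cf - -C /path/to/data . | nc raidbox 9000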

View 8 Replies View Related

Programming :: HTML / PHP Access To Files And Hanging Directory?

Oct 1, 2010

Code:
<html>
<head>
</head>
<body>

[Code]....

Alright, this works fine to pull the directories/files in the /var/Store/2010/ directory.

But when you click one of the links, it tries to open http://'serveraddress'/$filename.

Note that $filename in the URL is the filename that was clicked on, so the PHP script is working. But I need it to change to that directory so that you can see the folders/files there and work your way up, down, and sideways through the folder tree to where you need to go, not try to open it as a direct URL, which doesn't work.

View 3 Replies View Related

Programming :: Implement User Ranking In Php With A Huge Number Of Users?

Aug 21, 2010

I'm writing a user ranking module for a site. The ranking depends on some criteria, and it's possible to set or unset any one of these criteria in order to include it in, or exclude it from, the user rank calculation. Here's the way I've implemented the ranking calculation:

When I set one or more of the criteria to be considered in ranking, I insert one record per criterion for each user in the system. For example, if I have 2 criteria and both are set, and I have two users, I'll have:

Ranking table
--------------
username | criteria | to_be_added | score
--------------------------------------------------
user1 | criteria1 | 1 | 0

[code]....

It means I just set the to_be_added field to 1 for all of them and defer the calculation of each user's per-criterion score to the time that user logs in, to prevent doing all these calculations at once, because there are a huge number of users... But there is one problem: if I want to show, for example, the best user (based on the highest score), the result isn't always true, because some users might not have logged in by that time and their score might still be zero.

View 1 Replies View Related

Ubuntu Servers :: Deleted Log Files Taking Up Huge Disk Space?

Sep 7, 2010

My /var/ partition continues to fill up on all my servers, and it is because the logs in /var/log/apache2 or /var/log/mysql are being deleted during log rotation, but their file handles are being held open. Thus, "du -sh /var/log" shows the correct values, but "df | grep /var" shows something much different.

It seems that the log files rotate; however, if I run "lsof | grep deleted" it returns lots of files that are no longer visible in the directory but refuse to clear themselves off the disk.

The only way I have found to make these log files go away (and thus clear up the disk space on the partition I should have) is to restart either apache or mysql, depending on which process has huge sized log files being held open.

Is it just me, or is this a big flaw in the way Linux works, that it can't figure out how to release the file handle for a log so the disk space can be reclaimed? This is happening to me a lot lately.

Here is some output from one of my web servers so you can see what I am seeing...

root@web49:~# df -h | grep /var$
Filesystem Size Used Avail Use% Mounted on
/dev/sda8 9.2G 6.1G 2.7G 70% /var
root@web49:~# du -sh /var

[Code]....
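
This is less a kernel flaw than a rotation configuration gap: logrotate unlinks the file while the daemon keeps writing to the old handle, so the blocks stay allocated. Telling each daemon to reopen its logs, which packaged logrotate configs normally do in a postrotate block, releases the space without a full restart. A sketch:

Code:
# Confirm which processes hold deleted log files open:
lsof +L1 | grep /var/log

# Ask the daemons to reopen their logs instead of restarting them:
sudo apache2ctl graceful        # Apache reopens its log files
sudo mysqladmin flush-logs      # MySQL closes and reopens its logs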

View 9 Replies View Related






