Slackware :: Huge.s No Header Files ?
Feb 12, 2011

I've started using the huge.s kernel, and when I try to compile packages Slackware complains about missing kernel headers, but all I see on the Slackware discs are the SMP header files?
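The kernel-headers package Slackware ships is generated once (from the SMP kernel source) and is used for all of the stock kernels, including huge.s, so the "smp" in its name is not a problem. A rough sketch of checking for and installing it, with the package version string as an assumption:

Code:
# Is the headers package already installed?
ls /var/log/packages/ | grep kernel-headers

# If not, install it from the d/ series of the install media
# (exact version in the file name is an assumption)
installpkg /path/to/slackware/d/kernel-headers-2.6.33.4_smp-x86-1.txz

# These are the userspace headers that package builds look for
ls /usr/include/linux /usr/include/asm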
View 2 Replies

How do I create the huge.s kernel files on the Slackware disks? Or at least direct me to a post if the same question has already been asked. I currently rsync my files with Alien BOB's script, and I use syslinux to install from my USB stick. I want to install using a later kernel just for testing purposes (i.e. 2.6.34-rc3 as of this writing).
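The installer kernels are ordinary bzImages built from the configs shipped in the kernels/ directory of the Slackware tree, so one way to test a newer kernel is to rebuild a huge-style image from that config and drop it onto the stick. A sketch, with source paths and the USB layout as assumptions:

Code:
# Start from the shipped huge.s config and carry it forward to the new source
cd /usr/src/linux-2.6.34-rc3
cp /path/to/slackware/kernels/huge.s/config .config
make oldconfig          # answer the prompts for options new in this kernel
make bzImage

# Replace the kernel on the USB installer (path depends on how the stick was made)
cp arch/x86/boot/bzImage /mnt/usb/kernels/huge.s/bzImage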
View 9 Replies View Related

I am modifying the xf86-input-summa driver to work with my older Bitpad One digitizer (the Sketchpad uses slightly different communication, two more bits of resolution, etc.), but I have a problem building it. I ran configure and it reported no errors, but when I run make I get missing header files, like:
xf86Summa.c:33:25: error: xf86Version.h: No such file or directory
xf86Summa.c:122:24: error: xf86Config.h: No such file or directory
xf86Summa.c:124:24: error: atKeynames.h: No such file or directory
[code]....
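Those headers belong to the X.Org server SDK rather than to the driver itself, and xf86Version.h in particular is not shipped by newer servers, so older driver sources often need both the SDK installed and small source updates. A sketch of checking what the build environment provides (package and path names are assumptions):

Code:
# See whether the server SDK headers are installed and where they live
pkg-config --cflags xorg-server
ls /usr/include/xorg/ | head

# Pass the SDK include flags to the driver's build
./configure CFLAGS="$(pkg-config --cflags xorg-server) $(pkg-config --cflags xproto)"
make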
Using Slackware -current, and I'm really digging KDE 4.4.3. It's been way more stable on my machine than 4.3.x and it performs MUCH better. I think Slackware 13.1 is going to be a really good release, much better than 13.0. Looking forward to upgrading all of my hosts, though it will probably take me a few months given how many I manage.
View 3 Replies View Related

I am a long-time (since 1.something) Slackware user and maintainer of a mirror site, and I'm suddenly having problems with my favorite distro. (1) I've been mainly running Slackware64. When I was experiencing recent problems on a Slackware (32-bit) 13.1 system, I discovered that the huge-smp kernel does not support more than 4G of RAM. This is an obvious bug, and I am shocked that there is no fix out yet. Surely I can't be the only 32-bit Slackware user with more than 4G of RAM. I've verified it on a 12G i7 system and an 8G Athlon64-X2 system.
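On 32-bit x86, seeing more than roughly 4G of RAM depends on the kernel's high-memory/PAE configuration rather than on SMP support, so it is worth checking what the stock huge-smp config actually enables before calling it a bug. A quick sketch (config file names assume Slackware's usual /boot layout):

Code:
# How much memory does the running kernel actually see?
free -m

# Check the high-memory options in the stock kernel config
grep -E 'CONFIG_HIGHMEM|CONFIG_X86_PAE' /boot/config-huge-smp-*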
View 4 Replies View Related

I have installed Slackware 13.37 for testing on VMware. I installed just about everything, including all the available distro kernel images:
Code:
vmlinuz-generic-2.6.37.6
vmlinuz-generic-smp-2.6.37.6-smp
vmlinuz-huge-2.6.37.6
vmlinuz-huge-smp-2.6.37.6-smp
and the default is vmlinuz-huge-smp-2.6.37.6-smp.
I'm trying to switch to vmlinuz-generic-2.6.37.6. So, this is what I did:
fs = ext4
root fs = /dev/sda1
Code:
/boot# mkinitrd -c -k 2.6.37.6 -m ext4 -f ext4 -r /dev/sda1
I edited lilo.conf to point at the new image and ran the lilo command. Every time I boot the new kernel I get this error: [URL]
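With the generic kernel, both the initrd and the root device have to be spelled out in the lilo stanza, and lilo must be re-run afterwards; a missing or mismatched initrd line is the usual cause of an unbootable generic kernel. A minimal sketch of the stanza (label and paths are assumptions; mkinitrd -c writes /boot/initrd.gz by default):

Code:
# /etc/lilo.conf -- additional stanza for the generic kernel
image = /boot/vmlinuz-generic-2.6.37.6
  initrd = /boot/initrd.gz
  root = /dev/sda1
  label = generic
  read-only

# Reinstall the boot loader after every lilo.conf change
lilo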
I'm booting with Slackware 13.1 x64. My motherboard is a GA-X58A-UD5 rev. 2, but I can't boot. My hard drive, connected to SATA3 ports 6 and 7, is not getting detected. The controller is a Marvell SE9128.
How can I tell the huge.s kernel to load the drivers for this controller?
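The first step is finding out which driver the Marvell 9128 actually wants and whether huge.s carries it built in or as a module; with the BIOS in AHCI mode the SATA side is normally handled by the standard ahci driver. A sketch of how to check from the installer or another working boot (the module name and rc file are assumptions):

Code:
# Identify the controller and see which kernel driver/module claims it
lspci -nn | grep -i marvell
lspci -k

# If the needed driver is modular, load it by hand and make it permanent
modprobe ahci
echo "/sbin/modprobe ahci" >> /etc/rc.d/rc.modules.local   # adjust if this file is absent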
I'm trying to decide which kernel to install in my Slackware 13.37 installation. What is the difference between the huge.s and hugesmp.s kernels? Does one do something the other does not? I'm installing Slackware because I've read it has no PulseAudio baked into it. I hope neither kernel has any of that stuff.
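The two installer kernels are built from the same source; the practical difference is that hugesmp.s is configured for multi-processor/multi-core machines (SMP and the options that depend on it), while huge.s is a uniprocessor build aimed at older CPUs. Neither has anything to do with PulseAudio, which is userspace. One way to see the exact differences is to diff the shipped configs; a sketch, with file names assumed from the kernels/ directory of the install tree:

Code:
# Compare the configs the two installer kernels were built from
diff /path/to/slackware/kernels/huge.s/config \
     /path/to/slackware/kernels/hugesmp.s/config | less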
View 2 Replies View Related

Does anyone else notice that Slackware's users/groups list is cluttered with things no one will ever need? Do you really need to be part of the slocate group? Why doesn't every program just have its own group, then? I look at my own distro (LFS-built), SuSE and some others, and I can't help but notice the huge number of groups in Slackware. If you're going to add groups for everything, then why in the universe do we even need sudo? Just make a group for the binary you want to work and change its ownership... wtf are Pat and the rest of the Slackware team doing?
View 14 Replies View Related

So I noticed today that my machine's root hard drive had almost no space left on it.
I ran a disk usage analyzer and found that my /var/log folder is 95GB.
The large logs are:
Can I just delete them? And how can I stop this from happening again?
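Log files that syslog still holds open are safer to truncate than to delete, and logrotate is the piece that normally keeps them from growing without bound. A sketch of both steps (the log names are placeholders for whichever files the analyzer flagged):

Code:
# Truncate in place so the daemon writing the log keeps a valid file handle
: > /var/log/syslog
: > /var/log/kern.log

# Make sure rotation is actually configured and running
cat /etc/logrotate.conf
ls /etc/logrotate.d/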
When is it good to use separate translation units and object files and link them into the main C program, and when is it good to include the header files in the main C program? I don't understand whether most people include header files or just link in object files and use their contents in the main program. It's sort of a simple question, but it's confusing to me, and that's why I need help with it. I sort of don't understand the difference, or whether there's really any difference other than the way the final result is achieved, which way is better or preferred, etc.
For example:
Code:
or:
Code:
Simple explanation of the difference? Or which one is preferred or better? I've read a little on the ELF format... so is there no difference in the end result? Is it just a matter of preference or necessity, and of where the information is to begin with?
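Header files only carry declarations; the definitions live in .c files that are compiled into object files and then linked, so "including a header" and "linking an object file" are two halves of the same arrangement rather than alternatives. Whether you compile everything in one command or build objects separately, the linker produces essentially the same ELF executable. A sketch with hypothetical file names main.c, util.c and util.h:

Code:
# Separate translation units: compile each .c to an object, then link.
# main.c does '#include "util.h"' to get the declarations it needs.
gcc -Wall -c util.c -o util.o
gcc -Wall -c main.c -o main.o
gcc main.o util.o -o program

# Same sources, single command: the compiler driver does the same steps for you
gcc -Wall main.c util.c -o program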
I installed Debian on my laptop and now I get warnings that my log files inside /var are getting out of hand... The following files:
36G daemon.log
48G daemon.log.1
41G kern.log
55G kern.log.1
31G messages
42G messages.1
8.2G syslog
17G syslog.1
How can I clear these and set things up properly so they don't take up so much space?
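Before clearing anything it helps to see what is flooding kern.log and daemon.log, and then let Debian's normal rotation machinery reclaim the space. A sketch, assuming the stock rsyslog/logrotate setup:

Code:
# Look at what is being written over and over
tail -n 20 /var/log/kern.log
tail -n 20 /var/log/daemon.log

# Force an immediate rotation using the existing Debian configuration
logrotate --force /etc/logrotate.d/rsyslog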
How can I delete my log files? They are 131.2 GB! I need the space on my PC. Is it OK to delete them?
View 6 Replies View Related

I have a 60GB partition with / and home on it. I logged on yesterday and got a warning saying that I had only 1.9 GB of disk space left. I ignored it for a day and assumed that I had too many videos and pics. But the next day I had not added any files or downloaded any software, yet I had 0 B left. I used the disk usage analyser and found that 33GB came from /var/log. It was from two log files, syslog and daemon.log, 16.5GB each! I opened them up and found that this line of text was repeated hundreds of thousands of times:
Code:
Jul 22 19:32:36 aulenback-desktop ntfs-3g[5315]: Failed to decompress file: Value too large for defined data type
[code]...
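That message comes from the ntfs-3g process serving an NTFS volume, so the useful next steps are to work out which mount (and which open files) are involved and to stop the flood while the volume gets checked from Windows. A sketch using the PID from the log line (the mount point is a placeholder):

Code:
# Which NTFS (fuseblk) volumes are mounted, and what does that ntfs-3g process have open?
mount -t fuseblk
lsof -p 5315

# Stop the flood while investigating, then reclaim space by truncating the logs
umount /media/WINDOWS_VOLUME
: > /var/log/syslog
: > /var/log/daemon.log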
I have about 2 TB of 700MB AVI files as data on disc and want to spread them across two 2TB external USB drives (3.5" SATA inside the housing). Obviously I have to rip them to the laptop and then move them to the external HDD (what a laborious little task). Am I better off doing the ripping in Meerkat or on a Windows machine? The files need to be accessible from W7, XP, and Meerkat in VLC player. What should I format the drives as?
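For drives that must be read and written from XP, Windows 7 and Ubuntu, NTFS is the usual compromise: FAT32 caps individual files at 4 GB, while Linux writes NTFS fine through ntfs-3g. A sketch (the device name and label are placeholders; double-check the device before formatting):

Code:
# Quick-format the external drive as NTFS with a volume label
sudo mkfs.ntfs -Q -L movies1 /dev/sdX1

# Copy with rsync so an interrupted transfer can be resumed
rsync -rt --progress /path/to/avi/ /media/movies1/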
View 5 Replies View Related

I am facing a strange problem on my server. One of my filesystems shows as 3.1G when I execute the df -h command and the utilization shows as 83%, but when I cd to the directory /usr/local I cannot find any huge files in that filesystem, and I have searched for hidden files as well:
groupserver:~ # df -h
Filesystem Size Used Avail Use% Mounted on
/dev/sda9 3.1G 2.5G 532M 83% /usr/local
groupserver:/usr/local # du -sh *
0 bin
93M abinav
[Code]...
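When df and du disagree like this, the space is usually held either by deleted files that some process still has open or by files hidden underneath a mount point. A quick sketch of checking both:

Code:
# Deleted-but-still-open files keep their blocks allocated until the process exits
lsof +L1 | grep /usr/local

# Restrict du to this one filesystem so the numbers are comparable with df
du -shx /usr/local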
I'm looking for a fast way to verify a copy of a folder with 150 GB of data in 33 files. Some of the files are a few kB, while a few are 20-30 GB. I've done a file count, which is quick but doesn't verify that all the files are intact. I tried running md5sum on them, which works but will probably take as long as copying the files in the first place. diff works too, but is also slow.
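A full content check cannot avoid reading every byte at least once, so the practical shortcut is a metadata comparison first and a checksum pass only if something looks off. rsync can do both; a sketch (source and destination paths are placeholders):

Code:
# Dry run: report any file whose size or timestamp differs, without reading contents
rsync -an --itemize-changes /original/ /copy/

# Byte-for-byte verification: checksum both sides (slow, but no method avoids the reads)
rsync -anc --itemize-changes /original/ /copy/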
View 1 Replies View Related

I have a WAV file bigger than 8GB. I recorded it on a Windows PC. Unfortunately, WAV files can't be bigger than 2GB, yet somehow I got a file that is almost 9GB. I tried to chop the file under Ubuntu into smaller pieces so I could open it part by part. I used GNOME Split to divide the file and made 10 parts out of it. Now I have these parts of the data, which I can't read with any program except GNOME Split, which would only merge them together again and bring me back to the beginning of my problem. So my question is: is there any other way to open, or split and open, a WAV file of that size, or maybe a way to open the split file partially?
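A byte-level splitter produces chunks that are not valid audio files, which is why nothing else can open them. An audio tool that cuts on time boundaries writes playable pieces instead; ffmpeg is often tolerant of the wrong RIFF size field in oversized WAVs, though that is not guaranteed. A sketch against the original (or re-merged) file, with the segment length chosen arbitrarily:

Code:
# Cut the recording into one-hour WAV segments without re-encoding the audio
ffmpeg -i big.wav -f segment -segment_time 3600 -c copy part%03d.wav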
View 3 Replies View Related

I tried to update the header files manually in the include folder, because I thought that I needed to have windows.h etc., but now nothing compiles and I get a good 80 errors for even hello world. I have tried removing g++ and all related packages, deleting the header files folder and reinstalling, but it reinstalls the broken files. Is there any way of restoring the originals without reinstalling Ubuntu?
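The system headers under /usr/include are owned by packages, so the clean recovery is to find out which packages own the damaged files and reinstall those, rather than the whole OS. A sketch (the exact libstdc++ development package name varies by Ubuntu release):

Code:
# Which package owns a given header?
dpkg -S /usr/include/stdio.h        # typically libc6-dev

# Reinstall the header-owning packages so the originals come back from the archive
sudo apt-get install --reinstall libc6-dev linux-libc-dev
sudo apt-get install --reinstall g++   # plus the matching libstdc++-dev package for your release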
View 1 Replies View Related

I examined the problem and determined that a huge amount of disk space is being taken up by files in /var/log/. The following files:
Code:
/var/log/messages.1
/var/log/kern.log.1
/var/log/daemon.log.1
/var/log/messages
/var/log/kern.log
/var/log/daemon.log
/var/log/syslog
are all over 1 GB in size. The largest is 18 GB. Together, they total 48.3 GB. I restarted the system, forcing a fsck.
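Logs that big are almost always one message repeating, and finding that message tells you what to fix so the space does not fill up again. A sketch that strips the timestamp and hostname fields and counts the remaining lines (field positions assume the standard syslog format):

Code:
# Show the most frequently repeated messages in a log
awk '{ $1=$2=$3=$4=""; print }' /var/log/syslog | sort | uniq -c | sort -rn | head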
Is there any software in Linux to view huge .txt files, say, over 10 MB? I'm now using the default gedit, version 2.28.0, which does not seem to be able to open huge .txt files. It's the same with the Windows default .txt viewer, but on Windows, "Win Word" seems to work fine. Is there software under Linux to browse huge .txt files?
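Editors like gedit load the whole file into memory, which is what makes them struggle; a pager streams the file instead and opens even multi-gigabyte text files instantly. A sketch:

Code:
# View without loading the whole file: g = start, G = end, /pattern = search
less huge.txt

# Or peek at the ends without any pager at all
head -n 100 huge.txt
tail -n 100 huge.txt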
View 5 Replies View Related

syslog, messages and kern.log are incredibly huge files that are taking up a lot of space on my hard drive. Is it safe to remove them and/or to reduce logging so it doesn't take up such an enormous amount of hard disk space? If so, how can I reduce the logging so it doesn't produce logs that are tens of GB in size? Also, mounting a drive places it into the folder /media. Will it become problematic if the size of the mounted drive exceeds the amount of free space available on my Ubuntu partition?
View 4 Replies View Related

I started getting errors about running out of disk space in root this morning. I hunted down what's taking all the space; /var/log is 39GB (Ubuntu is installed on a 50GB partition). It's specific files that live in that directory, not subfolders. The files are:
kern.log = 11.6 GB
messages (plain text file) = 11.4 GB
kern.log.1 = 6.1 GB
[code]...
I do monthly reports by copying the previous document, updating the text and changing the images. The images are the same size and number each month. Since I upgraded my laptop to Natty last month, my document has suddenly gone from 942 kB to 10.1 MB as .odt. When saving to PDF, the usual size of 472 kB went up to 1.9 MB. I have searched the net and the forums but haven't seen anything about a similar issue.
I'm not sure if it's an issue caused by the previous document being produced in OpenOffice and now being updated and saved in LibreOffice, or if it's somehow something to do with the upgrade from Maverick to Natty. I would hope I don't have to uninstall LibreOffice and install OpenOffice as a solution (which I understand is not entirely easy in Natty; something I read said OpenOffice is transitional to LibreOffice). I can't email customers simple documents that are over 10 MB.
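An .odt file is an ordinary ZIP archive, so listing its contents shows exactly which embedded parts (usually the images under Pictures/) account for the growth, which in turn tells you whether the images are being re-encoded or duplicated. A sketch with a placeholder file name:

Code:
# List the pieces inside the document, largest entries last
unzip -l report.odt | sort -n | tail -n 15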
I streamed video through my computer with MediaTomb yesterday. The problem is that now I've got these huge log files. I am running out of disk space (less than 1 GB left) as we speak. They're filled with ufw entries, but my question is:
I read somewhere about a program called logrotate that is supposed to keep logs from getting too big. Is this wrong, or should MediaTomb really generate three separate log files with 5GB of data each for just two hours of streaming?
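logrotate does normally cap the growth, but it only runs once a day, so a firewall that logs every streaming packet can still fill the disk in between runs. Since the entries come from ufw, turning its logging down and forcing a rotation is the direct fix; a sketch:

Code:
# Lower or disable ufw's kernel logging
sudo ufw logging low        # or: sudo ufw logging off

# logrotate normally runs from daily cron; force a rotation now to reclaim space
sudo logrotate -f /etc/logrotate.conf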
I need to transfer a massive amount of data (2.5 terabytes, many files, a directory structure) to an embedded RAID box which has a minimal Linux on it (some custom distro from Western Digital). We tried rsync (version 2.6.7), but it crashes because the file list is too big for the available RAM (fixed in later versions of rsync, but I don't know how to update it; the box isn't Debian-based and there are no compiler tools). We tried NFS, but the maximum bandwidth achieved is around 1 MB/sec (CPU bound?), so it would take around 3 weeks this way. Samba has problems with big files (and we have some 20GB files in there).
SCP isn't installed, and would probably also be CPU bound due to encryption, I think. So the only option left would be FTP; we're currently trying ncftp with the command "put -R /path/to/data/", but it's been running for over an hour, eating up most of the RAM and not using any bandwidth. I think it is still building a file list or something. FTP already worked for a single 20GB file with an acceptable bandwidth of about 12 MB/sec. Does anyone know a better FTP client (for the console) that can start transferring data right away, or at least display an estimated time for the copy preparation?
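If the box exposes a shell at all, piping tar through netcat sidesteps both the upfront file-list problem and the encryption overhead, since the data is streamed file by file with no index built first. A sketch, assuming nc (or a BusyBox equivalent) exists on the box; the IP address, port and destination path are placeholders, and traditional vs. OpenBSD netcat differ on whether -p is needed:

Code:
# On the receiving RAID box: listen and unpack as the stream arrives
nc -l -p 9000 | tar -xf - -C /shares/data

# On the sending machine: stream the tree straight into the network
tar -cf - /path/to/data | nc 192.168.1.50 9000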
I need to figure out how to arrange for the fastest-possible read-access of a large or huge memory-mapped file. I'm writing high-speed real-time object-chasing software for a NASA telescope (on earth). This software must detect images of fast moving objects (across arbitrary fields of fixed stars), estimate what direction and speed the object image is traveling (based on the length and direction of a streak on the detection image), then chase after the object while capturing new 4Kx4K pixel images every 2~5 seconds, quickly matching its speed and trajectory, then continue to track and capture images until the object vanishes (below horizon, into earth shadow, etc).
I have created two star "catalogs". Both contain the same 1+ billion stars (and other objects), but one is a "master catalog" that contains all known information about each object (128 bytes per object == 143GB) while the other is a "nightly build" that only contains the information necessary to perform the real-time process (32 bytes per object == 36GB) with object positions precisely updated for precession and proper-motion each night. Almost always the information in the "nightly build" catalog will be sufficient for the high-speed (real-time) processes.
[Code]...
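Inside the program the usual knobs are madvise()/MAP_POPULATE hints on the mapping, but from the shell you can also make sure the 36 GB nightly catalog is already resident in the page cache before the run starts, so the mmap'ed lookups never stall on disk. A sketch using vmtouch (a third-party utility that may need to be installed; paths and device names are placeholders):

Code:
# Fault every page of the catalog into the page cache ahead of time
vmtouch -t /data/nightly_catalog.bin

# Optionally lock it there for the observing run (needs privileges and spare RAM)
vmtouch -l /data/nightly_catalog.bin

# Raise readahead on the disk holding the catalog
blockdev --setra 16384 /dev/sdb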
I've installed Fedora 12 (the KDE version) on VMware Workstation running under Windows 7.
I've been able to configure the VMware tools fine up until this point:
Code:
What is the location of the directory of C header files that match your running kernel?

After some searching online I deduced I should install all of the latest updates and kernel-devel.
I then nuked the previous VMware tools install and started over with it again, but alas, no directory I try works (i.e. /usr/src/kernels/, /usr/src/kernels/2.6.31.5-127.fc12.i686/ and so forth).
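The VMware installer wants the build directory for the kernel that is actually running, and the usual reason nothing is accepted is that the running kernel is older than the kernel-devel package pulled in by the updates. A sketch of how to check and what path to give it:

Code:
# Compare the running kernel with the installed kernel and kernel-devel packages
uname -r
rpm -q kernel kernel-devel

# If they differ, boot the newest kernel (or install the matching kernel-devel),
# then point the installer at this directory:
ls -d /usr/src/kernels/$(uname -r)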
I'm trying to add a header to many .cpp files in a directory (or directory tree).
My attempt was:
Code:
If I put part of the command inside quotes the substitution occurs, but then I get other errors like:
Code:
cat: No such file or directory
I could just write a small program to do it.
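The quoting trouble usually comes from trying to do the concatenation inside a single find -exec or xargs string; a plain shell loop over find's output avoids it entirely. A sketch that prepends a (hypothetical) header.txt to every .cpp in the tree:

Code:
# Prepend header.txt to each .cpp file, using a temp file so nothing is lost on error
find . -name '*.cpp' -print0 | while IFS= read -r -d '' f; do
    cat header.txt "$f" > "$f.tmp" && mv "$f.tmp" "$f"
done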
I am new to Linux. I want to use the header file asm/msr.h, but /usr/include does not contain the header file that I want. Can I just copy the whole asm directory into /usr/include and overwrite the old one?
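Overwriting /usr/include/asm by hand is risky, because those are sanitized userspace headers that the C library depends on. The safer routes are to install the distribution's kernel headers package, or to add the kernel source's include path just for your own build. A sketch (package names depend on the distribution, and the include path is only an example):

Code:
# Install the sanitized kernel headers through the package manager
sudo apt-get install linux-libc-dev      # Debian/Ubuntu
sudo yum install kernel-headers          # Fedora/RHEL

# Or compile against the kernel source tree without touching /usr/include
gcc -I/usr/src/linux/include myprog.c -o myprog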
View 2 Replies View Related