Slackware :: Create The Huge.s Kernel Files On The Disks?
Apr 12, 2010
How do I create the huge.s kernel files found on the Slackware discs? Or at least point me to a post if the same question has already been asked. I currently rsync the Slackware tree with Alien BOB's script, and I use syslinux to install from my USB stick. I want to install using a later kernel just for testing purposes (i.e. 2.6.34-rc3 as of this writing).
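One approach, sketched below, is to build the new kernel yourself and drop it in where syslinux expects the installer kernel; the config and stick paths are assumptions based on the standard tree layout, so adjust them to match your setup.
Code:
cd /usr/src/linux-2.6.34-rc3
cp /boot/config-huge-2.6.33.4 .config   # hypothetical path to the stock huge config as a starting point
make oldconfig                          # answer the questions for options new in this release
make bzImage
cp arch/x86/boot/bzImage /mnt/usbstick/kernels/huge.s/bzImage   # installer kernel path assumed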
View 9 Replies
Jun 22, 2011
Fresh and full install: Slackware 13.37 64-bit x86_64. What is the correct procedure to switch from the huge kernel to a 2.6.38.4 kernel?
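For reference, one hedged outline on 13.37 is: install the new kernel image and modules, build an initrd for it, and repoint lilo. In the sketch below, the ext4 filesystem and /dev/sda1 root device are assumptions; substitute your own.
Code:
/usr/share/mkinitrd/mkinitrd_command_generator.sh -k 2.6.38.4   # prints a suggested mkinitrd line for your system
mkinitrd -c -k 2.6.38.4 -m ext4 -f ext4 -r /dev/sda1
# then add a lilo.conf stanza for the new image with initrd = /boot/initrd.gz, and run: lilo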
View 6 Replies
View Related
Nov 15, 2010
I am a long-time (1.something) Slackware user and maintainer of a mirror site. I'm suddenly having problems with my favorite distro. I've been mainly running Slackware64, but when I hit recent problems on a Slackware (32-bit) 13.1 system, I discovered that the huge-smp kernel does not support more than 4G of RAM. This looks like an obvious bug to me, and I am shocked that there is no fix out yet; surely I can't be the only 32-bit Slackware user with more than 4G of RAM. I've verified it on a 12G i7 system and an 8G Athlon64-X2 system.
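For what it's worth, on 32-bit kernels the ~4G ceiling is a build-time choice rather than necessarily a bug; you can check what the shipped kernel was compiled with. The config path below is an assumption for the stock 13.1 huge-smp package:
Code:
grep HIGHMEM /boot/config-huge-smp-2.6.33.4-smp   # path assumed
# CONFIG_HIGHMEM4G=y caps addressable RAM near 4GB; CONFIG_HIGHMEM64G=y (PAE) is needed for more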
View 4 Replies
View Related
Feb 12, 2011
I've started using the huge.s kernel, and when I try to compile packages Slackware complains about kernel headers, but all I can find on the Slackware discs are the SMP header files?
View 2 Replies
View Related
Mar 2, 2010
I'm trying to install Slackware64 on a RAID 0 fakeraid, and I found out that I need kernel support (dmraid) for that. What I did: I set up a virtual machine with no RAID, compiled the kernel, and created a new ISO which included my own kernel in the /kernels folder. I burned the image, but on boot I got the "image checksum error...". I got confused by all the stuff I've read about Linux booting. I need a simple guide to making a custom Slackware ISO that boots: how do I generate new checksums, and how should I write the ISO file to make it work?
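For reference, a rough sketch of regenerating the checksums and mastering a BIOS-bootable ISO, run from the top of the tree; the volume label and output path are placeholders:
Code:
find . -type f -exec md5sum {} \; > CHECKSUMS.md5
mkisofs -o /tmp/slackware64-custom.iso -R -J -V "SlackCustom" \
  -b isolinux/isolinux.bin -c isolinux/isolinux.boot \
  -no-emul-boot -boot-load-size 4 -boot-info-table .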
View 4 Replies
View Related
May 24, 2010
Using Slackware current, and I'm really digging KDE 4.4.3. It's been way more stable on my machine than 4.3.x, and it performs MUCH better. I think Slackware 13.1 is going to be a really good release, much better than 13.0. Looking forward to upgrading all of my hosts, though it will probably take me a few months given how many I manage.
View 3 Replies
View Related
Jun 2, 2011
I have installed Slackware 13.37 for testing on VMware. I installed just about everything, including all the available distro kernel images:
Code:
vmlinuz-generic-2.6.37.6
vmlinuz-generic-smp-2.6.37.6-smp
vmlinuz-huge-2.6.37.6
vmlinuz-huge-smp-2.6.37.6-smp
and the default is vmlinuz-huge-smp-2.6.37.6-smp.
I'm trying to switch to vmlinuz-generic-2.6.37.6, so here is what I did:
fs = ext4
root fs = /dev/sda1
Code:
/boot# mkinitrd -c -k 2.6.37.6 -m ext4 -f ext4 -r /dev/sda1
Edited lilo.conf to point at the new image and ran the lilo command. Every time I boot the new kernel I get this error: [URL]
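A common cause of this kind of failure is a lilo.conf stanza that points at the generic image but omits the initrd line; a hypothetical stanza matching the mkinitrd call above would look like:
Code:
image = /boot/vmlinuz-generic-2.6.37.6
  initrd = /boot/initrd.gz
  root = /dev/sda1
  label = generic
  read-only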
View 14 Replies
View Related
Sep 7, 2010
I'm booting Slackware 13.1 x64. My motherboard is a GA-X58A-UD5 rev. 2, but I can't boot: the hard drives connected to SATA3 ports 6 and 7 are not detected. The controller is a Marvell SE9128.
How can I tell huge.s to load the drivers for this controller?
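One hedged diagnostic, once booted from another disk or the installer: check whether the controller is visible on the PCI bus at all, and whether the AHCI driver will claim it (the SE9128 is normally an AHCI-class device):
Code:
lspci -nn | grep -i marvell   # is the SE9128 visible at all?
modprobe ahci                 # force the AHCI driver, in case it isn't built in
dmesg | tail                  # look for newly attached SATA links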
View 3 Replies
View Related
May 26, 2011
I'm trying to decide which kernel to install in my Slackware 13.37 installation. What is the difference between the huge.s and hugesmp.s kernels? Does one do something the other does not? I'm installing Slackware because I've read it has no PulseAudio baked into it; I hope neither kernel has any of that stuff.
View 2 Replies
View Related
Feb 19, 2010
Has anyone else noticed that Slackware's users/groups list is cluttered with things no one will ever need? Do you really need to be part of the slocate group? Why doesn't every program just have its own group, then? I look at my distro (LFS-built), SuSE, and some others, and I can't help noticing the huge number of groups in Slackware. If you're going to add groups for everything, then why in the universe do we even need sudo? Just make the binary you want a group and change its ownership... wtf are Pat and the rest of Slackware doing?
View 14 Replies
View Related
Mar 11, 2010
I need to know what files are contained in the kernel headers and where they are.
Are they the same for each kernel or not?
I need the Slackware kernel headers for kernel 2.6.27.27.
I just need the headers.
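For background: on Slackware the sanitized headers that userland builds against live under /usr/include and come from the kernel-headers package, which is matched to glibc rather than to each running kernel. Headers for an arbitrary source tree can be generated with the kernel's own target; the install path below is a placeholder:
Code:
ls /usr/include/linux    # what the kernel-headers package provides
make -C /usr/src/linux-2.6.27.27 headers_install INSTALL_HDR_PATH=/tmp/hdrs   # output path assumed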
View 4 Replies
View Related
Jul 7, 2011
So I noticed today that my machine's root hard drive had almost no space left on it.
I ran a disk usage analyzer and found out my /var/log folder is 95GB.
The large logs are:
Can I just delete them? And how can I stop this from happening again?
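Deleting live log files out from under the syslog daemon can leave the space held until the daemon restarts; truncating them in place is the safer move. A minimal sketch:
Code:
# empty the biggest offenders without breaking open file handles
> /var/log/syslog
> /var/log/daemon.log
# then tail whichever refills fastest to find the noisy process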
View 1 Replies
View Related
Jan 29, 2016
I installed Debian on my laptop and now I get warnings that my log files inside /var are getting out of hand. The following files:
36G daemon.log
48G daemon.log.1
41G kern.log
55G kern.log.1
31G messages
42G messages.1
8.2G syslog
17G syslog.1
How can I clear these out and set things up properly so they don't take up so much space?
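To keep them from regrowing, size-based rotation helps; a hedged logrotate drop-in, where the file name and limits are placeholders to tune:
Code:
# /etc/logrotate.d/bigsyslogs  (hypothetical)
/var/log/syslog /var/log/kern.log /var/log/daemon.log /var/log/messages {
    size 100M
    rotate 4
    compress
    missingok
    notifempty
}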
View 6 Replies
View Related
Nov 19, 2010
How can I delete my log files? They are 131.2 GB, and I need the space on my PC. Is it OK to delete them?
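Before deleting anything, it's worth confirming which files are actually responsible; one quick check:
Code:
du -sh /var/log/* | sort -h | tail -5   # the five largest items under /var/log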
View 6 Replies
View Related
Mar 12, 2010
I recently did a fresh install of current, and once I had it up and running I compiled a fresh 2.6.33 kernel using my old config file. But now I get this warning during boot, specifically during module loading: "WARNING: All config files need .conf: /etc/modprobe.d/sound, it will be ignored in a future release." This doesn't seem to be causing any problems, but I am curious to know what the message means. I checked /etc/modprobe.d but everything looks normal.
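The warning means exactly what it says: files in /etc/modprobe.d/ now need a .conf suffix to be read. A one-line fix, assuming the file is yours to rename:
Code:
mv /etc/modprobe.d/sound /etc/modprobe.d/sound.conf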
View 1 Replies
View Related
Jul 25, 2010
I have a 60GB partition with / and /home on it. I logged on yesterday and got a warning saying that I had only 1.9 GB of disk space left. I ignored it for a day and assumed that I had too many videos and pics. But the next day, without adding any files or downloading any software, I had 0B left. I used the disk usage analyzer and found that 33GB came from /var/log: two log files, syslog and daemon.log, at 16.5GB each! I opened them up and found this line of text repeated hundreds of thousands of times:
Code:
Jul 22 19:32:36 aulenback-desktop ntfs-3g[5315]: Failed to decompress file: Value too large for defined data type
[code]...
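Beyond truncating the two logs, the flood only stops when its source does; a hedged cleanup, where the mount point is a placeholder:
Code:
umount /media/windows    # hypothetical NTFS mount generating the ntfs-3g errors
> /var/log/syslog        # then reclaim the space in place
> /var/log/daemon.log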
View 3 Replies
View Related
Jul 27, 2011
I have about 2 TB of 700MB AVI files as data on disc and want to spread them across two 2TB external USB drives (3.5" SATA inside the housing). Obviously I have to rip them to the laptop and then move them to the external HDD (omg, laborious little task). Am I better off doing the ripping in Meerkat or on a Windows machine? The files need to be accessible from W7, XP, and Meerkat, for playback in VLC. What should I format the discs as?
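If all three systems must read and write the drives, NTFS is the usual compromise (FAT32 caps individual files at 4GB, and the ext filesystems need extra drivers on Windows). A hedged formatting sketch, with the device name as a placeholder:
Code:
mkfs.ntfs -f -L movies1 /dev/sdX1   # -f = quick format; replace /dev/sdX1 with the real partition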
View 5 Replies
View Related
Dec 22, 2010
I am facing a strange problem on my server. One of my filesystems shows as 3.1G when I execute the df -h command, with utilization at 83%, but when I cd into /usr/local I cannot find any huge files in that filesystem, and I have searched for hidden files as well:
groupserver:~ # df -h
Filesystem Size Used Avail Use% Mounted on
/dev/sda9 3.1G 2.5G 532M 83% /usr/local
groupserver:/usr/local # du -sh *
0 bin
93M abinav
[Code]...
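A classic explanation for df and du disagreeing like this is a deleted file still held open by a running process; lsof can confirm it:
Code:
lsof +L1 /usr/local   # open files on that filesystem with a link count of zero (deleted but not released)
# restarting the process that holds them (or rebooting) returns the space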
View 2 Replies
View Related
Oct 14, 2010
The new 2.6.35.7 kernel fails to boot on my Lenovo laptop. I had previously compiled a 2.6.35 kernel with a couple of different .config files and never had it boot properly. The failure occurs very quickly and I am including the final screenshot in case that helps.
View 7 Replies
View Related
Feb 19, 2011
Let's say that I want Ubuntu, Kubuntu, Xubuntu, Lubuntu, Mythbuntu, Fedora and Julia on one single bootable DVD. How do I do that?
View 1 Replies
View Related
Oct 29, 2010
I'm looking for a fast way to verify a copy of a folder with 150 gigs of data in 33 files. Some of the files are a few KB, while a few are 20-30 gigs. I've done a file count, which is quick but doesn't verify that the files are intact. I tried running md5sum on them, which works, but it will probably take as long as copying the files in the first place. Diff works too, but it's also slow.
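There is no byte-level verification that avoids reading the data, but a metadata-only pass catches missing or truncated files almost instantly; a sketch:
Code:
rsync -rn --size-only --itemize-changes /source/ /copy/
# dry run: flags any file that is missing or differs in size, without reading file contents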
View 1 Replies
View Related
Jun 24, 2011
I have a WAV file bigger than 8GB that I recorded on a Windows PC. Supposedly WAV files can't be bigger than 2GB, yet somehow I got a file that is almost 9GB. I tried to chop the file into smaller pieces under Ubuntu so I could open it part by part: I used GNOME Split to divide it into 10 parts. Now I have parts that no program except GNOME Split can read, and merging them together again would only bring me back to the beginning of my problem. So my question is: is there any other way to open (or split and then open) a WAV file of that size, or maybe a way to open the split file partially?
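One hedged alternative: ffmpeg is often tolerant of WAV files whose 32-bit header size field has overflowed, and it can cut out playable pieces directly; the times below are placeholders:
Code:
ffmpeg -i big.wav -ss 00:00:00 -t 01:00:00 part1.wav   # first hour
ffmpeg -i big.wav -ss 01:00:00 -t 01:00:00 part2.wav   # second hour, and so on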
View 3 Replies
View Related
Jun 25, 2010
I've got 3 extra disks on OpenSUSE 11.2, all the same size. I've created a partition of type 0xFD on each of them. If I then try to add a RAID in YaST, I get "There are not enough suitable unused devices to create a RAID."
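If YaST refuses, the same array can usually be created from the command line with mdadm, which also reports why a device is unsuitable; a sketch with assumed device names and level:
Code:
mdadm --create /dev/md0 --level=5 --raid-devices=3 /dev/sdb1 /dev/sdc1 /dev/sdd1   # pick your own RAID level
cat /proc/mdstat   # watch the array assemble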
View 9 Replies
View Related
May 18, 2010
I examined the problem and determined that a huge amount of disk space is being taken up by files in /var/log/. The following files:
Code:
/var/log/messages.1
/var/log/kern.log.1
/var/log/daemon.log.1
/var/log/messages
/var/log/kern.log
/var/log/daemon.log
/var/log/syslog
are all over 1 GB in size. The largest is 18 GB; together they total 48.3 GB. I restarted the system, forcing an fsck.
View 3 Replies
View Related
May 19, 2010
Is there any software in Linux to view huge .txt files, say over 10 MB? I'm now using the default gedit, version 2.28.0, which doesn't seem able to open huge .txt files. It's the same with the default .txt viewer on Windows, though WinWord seems to work fine there. Is there software under Linux to browse huge .txt files?
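gedit loads the whole file into memory, which is what fails; a pager streams it from disk instead. A minimal sketch:
Code:
less huge.txt                  # streams the file; /pattern searches, G jumps to the end
split -b 50M huge.txt chunk_   # or cut it into 50MB pieces any editor can open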
View 5 Replies
View Related
Oct 9, 2010
syslog, messages, and kern.log are incredibly huge files that are taking up a lot of space on my hard drive. Is it safe to remove them and/or reduce logging so they don't take such an enormous amount of disk space? If so, how can I reduce the logging so it doesn't produce logs that are tens of GB in size? Also, mounting a drive places it in the folder /media. Will it become problematic if the size of the mounted drive exceeds the amount of free space available on my Ubuntu partition?
View 4 Replies
View Related
Feb 2, 2011
I started getting errors about running out of disk space in root this morning. I hunted down what's taking all the space: /var/log is 39GB (Ubuntu is installed on a 50GB partition). It's specific files that live in that directory, not subfolders. The files are:
kern.log = 11.6 GB
messages (plain text file) = 11.4 GB
kern.log.1 = 6.1 GB
[code]...
View 9 Replies
View Related
May 3, 2011
I do monthly reports by copying the previous document, updating the text, and changing the images. The images are the same size and number each month. Since I upgraded my laptop to Natty last month, my document has suddenly gone from 942 kB to 10.1 MB in .odt, and when saving to PDF the usual size of 472 kB went up to 1.9 MB. I have searched the net and the forums but haven't seen anything about a similar issue.
I'm not sure if the issue is that the previous document was produced in OpenOffice and is now updated and saved in LibreOffice, or if it's somehow to do with the upgrade from Maverick to Natty. I would hope I don't have to uninstall LibreOffice and install OpenOffice as a solution (which I understand is not entirely easy in Natty; I read something about OpenOffice being a transitional package to LibreOffice). I can't email customers simple documents that are over 10 MB.
View 1 Replies
View Related
Jul 4, 2011
I streamed video through my computer with MediaTomb yesterday. The problem is that now I have these huge log files, and I am running out of disk space (less than 1 GB left) as we speak. They're filled with ufw entries, but my question is:
I read somewhere about a program called logrotate that is supposed to keep logs from getting too big. Is that wrong, and should MediaTomb really generate 3 separate log files of 5GB each for just 2 hours of streaming?
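logrotate does cap log growth, but it runs daily by default, so a two-hour flood can outrun it; silencing the source helps more. If the entries really are ufw's, one hedged option:
Code:
ufw logging off   # or 'ufw logging low' to keep minimal firewall logging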
View 2 Replies
View Related
Oct 26, 2009
I need to transfer a massive amount of data (2.5 terabytes; many files in a directory structure) to an embedded RAID box which runs a minimal Linux (some custom distro from Western Digital). We tried rsync (version 2.6.7), but it crashes because the file list is too big for the available RAM (fixed in later versions of rsync, but I don't know how to update; it's not Debian-based and there are no compiler tools). We tried NFS, but the maximum bandwidth it produces is around 1 MB/sec (CPU-bound?), so it would take around 3 weeks this way. Samba has problems with big files (and we have some 20GB files in there).
SCP isn't installed, and would probably also be CPU-bound due to encryption, I think. So the only option left would be FTP. We're currently trying ncftp with the command "put -R /path/to/data/", but it's been running for over an hour, eating up most of the RAM and not using any bandwidth; I think it is still building a file list or something. FTP already worked for a single 20GB file with acceptable bandwidth of about 12 MB/sec. Does anyone know a better console FTP program that can start transferring data right away, or at least display an estimated time for the copy preparation?
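For a one-shot bulk copy on a trusted LAN, tar piped through netcat avoids both encryption overhead and up-front file-list building; a sketch where the port and hostname are placeholders, and the flag spelling varies between netcat variants:
Code:
# on the receiving raid-box:
nc -l -p 9000 | tar -xf - -C /mnt/raid
# on the sending machine:
tar -cf - /path/to/data | nc raidbox 9000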
View 8 Replies
View Related