Slackware :: Is The Next 32-Bit Release - 14 - Going To Have Large Memory Capability Enabled?
Jan 4, 2011
Just wondering, now that I see computers everywhere with 6G of RAM or more, whether the new 32-bit kernels are going to be set up for large amounts of RAM, or will they still need to be recompiled? Also, will there be better SSD support for things like TRIM? I know you edit the fstab to get the Linux version of TRIM, but I am just wondering if stuff like that will be automatic or if I will continue to need to tweak. I like to tweak, but sometimes it's the lack of need for tweaking that makes me like Slack.
I want to take out the multilib capability for my system. Is it as simple as using "slackpkg clean-system" to remove all the compat32 packages, then using "slackpkg upgrade-all" to replace all the gcc and glibc packages? The reason I ask is that I only used the multilib to be able to run Google Earth. There are no other x86 packages on the system. I have another x86 computer that will run Google Earth if I need it. It just wasn't worth it for this one package.
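A minimal sketch of that sequence, assuming an AlienBOB-style multilib setup (compat32 packages plus multilib gcc/glibc); review what slackpkg proposes before confirming, since "clean-system" removes everything not in the official package set:

```shell
# Remove everything not in the official Slackware package set, which on a
# multilib system includes all the *-compat32 packages:
slackpkg clean-system

# Then replace the multilib gcc/glibc with the stock packages from the
# Slackware tree:
slackpkg upgrade-all
```

Afterwards, `ls /var/log/packages | grep compat32` should come back empty if everything was removed.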
I want to transfer an arbitrarily large file (say >20GB) between 2 servers. I have several considerations:
- Must use port 22 (ssh) because of firewall restrictions
- Cannot tax the CPU (production server)
- Memory efficiency
- Would prefer a checksum check, but that could be done manually
- Time is not of the essence
Two scenarios: (1) Server A and Server B are on the same private network (sharing a switch) and data security is not a concern; (2) Server A and Server B are not on the same network and the transfer will go over the public internet, so data security is a concern. My first thought was using nice on an scp command with a non-CPU-intensive cipher (blowfish?). But I thought I'd refer to the SU community for recommendations.
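One hedged sketch of a low-impact transfer over port 22, using rsync so an interrupted copy can resume. Note that blowfish and arcfour have been removed from modern OpenSSH builds, so check `ssh -Q cipher` before counting on a "cheap" cipher; paths, host, and the bandwidth cap are placeholders:

```shell
# Lowest CPU and I/O scheduling priority on the sending side, a bandwidth
# cap (in KB/s) so the link is not saturated, and resumable transfers:
nice -n 19 ionice -c3 rsync --partial --inplace --bwlimit=20000 \
    -e 'ssh -p 22' /path/to/bigfile user@serverB:/destination/

# Manual checksum check afterwards -- run on both ends and compare:
sha256sum /path/to/bigfile
```

For the same-switch, no-security case, the cipher choice barely matters; for the public-internet case, stick with the OpenSSH defaults rather than hunting for a weaker cipher.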
I used 9.04 for months and it worked fine before restarting my PC. After I restarted, memory consumption goes up to 4.2 GB after login. However, I cannot find any process that consumes such a large amount of memory.
I am trying to understand a large amount of allocated memory that seems not to be accounted for on my system. I'll say up front that I am discussing memory usage without cache and buffers, because I know that misunderstanding comes up a lot. I am on a KDE 4.3 desktop (Kubuntu 9.10), using a number of Java apps like Eclipse that tend to eat up a lot of memory. After a few days, even if I quit most apps, 1 GB of RAM remains allocated (out of 2 GB). This appeared excessive, so I took the time to add up all values of the RES column in htop (for all users). The result was about 1/2 GB. Am I trying to match the wrong values? Or could some memory be allocated and not show up in the process list? This is the output of free:
Code:
             total       used       free     shared    buffers     cached
Mem:       2055456    1940264     115192          0     123864     702900
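A quick way to repeat the RES-column experiment, with one caveat: shared pages are counted once per process, so the sum usually *over*-states process memory rather than under-stating it, and kernel-side allocations (slab, page tables) never appear in any RES column at all. A sketch:

```shell
# Sum the RSS of every process, in kB, to compare against `free`:
ps -eo rss= | awk '{sum += $1} END {print sum " kB total RSS"}'

# Kernel-side memory that no process "owns" shows up here instead:
grep -E '^(Slab|SReclaimable|PageTables)' /proc/meminfo
```

If the gap between the RSS total and `free`'s "used" figure is large, /proc/meminfo is usually where the missing memory is hiding.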
I'm doing a clean install on a new machine, and when I tried to boot the install DVD, ISOLinux gave me the following error:
Code:
Could not find kernel image: linux

After some experimentation, I found that switching the SATA mode in the BIOS from AHCI to Native IDE solved the problem, and installation is proceeding normally.
However, I want AHCI enabled. (I think, unless someone has a compelling reason why I shouldn't.) My web-searching suggests that if the system is installed in IDE mode, it won't boot if I switch to AHCI afterwards. Is this a common problem? Is there a workaround? Is it a motherboard issue?
Today I stumbled upon the Magic SysRq key. Basically, it allows the user to perform various low-level commands regardless of the system's state, e.g. Alt+SysRq+b will reboot the system without syncing or unmounting. (SysRq is Print Screen on most systems.) I see the use for kernel hackers and such, who compile their own kernels anyway, but why is this enabled by default?
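For anyone who wants it off, a sketch of how to inspect or disable the feature; the value is a bitmask on modern kernels, and the sysctl.conf path assumes a traditional sysctl setup:

```shell
# Current setting (0 = disabled, 1 = all functions enabled, other values
# are a bitmask of allowed functions):
cat /proc/sys/kernel/sysrq

# Disable it on the running system (as root):
echo 0 > /proc/sys/kernel/sysrq

# Make the change persistent across reboots:
echo 'kernel.sysrq = 0' >> /etc/sysctl.conf
```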
I don't know if this is a Slackware-related issue, but I have the following problem. I'm running slackware64-current on my system. For my private data I'm using a QNAP NAS (some ARM CPU with Linux kernel 2.6.22), with the file shares provided over NFS. I mount them with

mount -t nfs 192.168.0.2:/Public /mnt/qnap

This works fine, no problems. But now, if I try to copy some large files (> 1 GiB) to the NAS share, sometimes the system completely freezes during the copy process. I have to do a hard reset to bring the system back to work.
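A couple of things worth trying, sketched here as assumptions rather than a known fix: a slow NAS can let a fast client queue up a huge backlog of dirty pages, which is a classic recipe for apparent freezes during big copies. Smaller transfer sizes over TCP, and a cap on dirty writeback data, sometimes tame it:

```shell
# Conservative mount options for an old, slow NFS server -- the rsize/wsize
# values are a starting point to experiment with, not a recommendation:
mount -t nfs -o tcp,rsize=32768,wsize=32768,hard,intr \
    192.168.0.2:/Public /mnt/qnap

# Limit how much dirty data the kernel buffers before forcing writeback
# (value in bytes; tune to taste, as root):
sysctl -w vm.dirty_bytes=50000000
```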
I've been playing around with browsers today, and just for kicks I'm trying to figure out how to get WebKit enabled in Konqueror. Has anyone got this to work? I've done a lot of googling but haven't gotten anything to work yet.
I updated to the latest -current last night and since then have not been able to log in to my KDE desktop. After typing in the password, the starter icons come up, and just as the last one appears, the desktop logs out to the login screen again. I worked out that it was because desktop effects were enabled: disabling these in kwinrc allows me to log in, although without desktop effects. Using NVIDIA 256.53 binary drivers, b.t.w.
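For anyone hitting the same login loop, the kwinrc edit can be done from a text console before logging in. A sketch assuming the KDE 4 config location (the path may differ per distribution); KDE's config reader takes the last occurrence of a key, so appending is usually enough as a rescue fix:

```shell
# Turn off KWin desktop effects by appending a [Compositing] section:
CONF="$HOME/.kde/share/config/kwinrc"
mkdir -p "$(dirname "$CONF")"
printf '\n[Compositing]\nEnabled=false\n' >> "$CONF"
```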
Got tired of long waits for fsck on very large partitions. Here's a script to fsck selected partitions every 'N' shutdowns. No more boot delays for fsck (unless something is really wrong).
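The core of the idea can be sketched like this, with N, the counter location, and the hook point all placeholders; it assumes being called from a shutdown script (e.g. /etc/rc.d/rc.6 on Slackware) and that the boot scripts honour the conventional /forcefsck flag file:

```shell
#!/bin/sh
# Force a full fsck only every Nth shutdown.
N=20
COUNTER_FILE=/var/run/fsck-counter   # hypothetical location

count=$(cat "$COUNTER_FILE" 2>/dev/null || echo 0)
count=$((count + 1))
if [ "$count" -ge "$N" ]; then
    count=0
    touch /forcefsck    # ask the next boot to run fsck on its partitions
fi
echo "$count" > "$COUNTER_FILE"
```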
Update1: On my system, '/usr/libexec/gam_server' (the gamin component used by Xfce) prevented /home from being unmounted. I changed Code:
After installing Slackware 13.1, I start up Amarok, and when I go in and configure the settings it starts to scan the folder and either hangs at 10%, stops responding altogether, or crashes. The library is about 130 GB of MP3s. I do not know where to start on this one. Amarok version 2.3.0.
My system: Slackware 13.0, 512MB RAM, x86 This is the webcam I'm trying to get working:
Quote:
Originally Posted by lsusb 2460 Pixart Imaging, Inc. Q-TEC WEBCAM 100
When I plug the webcam in (USB), the LED starts to shine, indicating that it is filming. When I use a program (XSane, for example) and click the 'scan' button, the LED turns off! It seems that the cam works when it shouldn't, and vice versa.
Been thinking about changing over to 64 bit, but I was just curious about whether or not I'll have to worry about incompatibility with a few 32bit-only applications I use, once I've set up multilib. I'll test it out in vbox when I get home but I wanted to check here to see if I could maybe get a solid yes/no/maybe answer.
Reading and writing work absolutely fine with small files, but large files are tediously slow when writing to the server. The exported directories use the options (rw,no_subtree_check).
What is your experience with NFS, and how can I speed up large file/folder transfer (write) speeds?
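A sketch of the export and mount options that most commonly affect NFS write throughput; paths and hosts are placeholders, and note the trade-off that `async` acknowledges writes before they reach disk (faster, but riskier if the server crashes):

```shell
# Server side -- /etc/exports entry (placeholder path and subnet):
#   /srv/share  192.168.0.0/24(rw,async,no_subtree_check)
#
# Re-export after editing the file:
exportfs -ra

# Client side -- try a larger write size over TCP:
mount -t nfs -o tcp,wsize=32768 server:/srv/share /mnt/share
```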
I am currently running Slackware 12.2 on a 25 GB partition. I like to use SlackBuilds, but when I try to compile larger tarballs (like AbiWord, or a patched version of Ghostscript as I did today) I receive an error message: 'Not enough space left on device'. The partition itself should be big enough (I never got this message when compiling with Linux From Scratch). I think it has something to do with the size of my /tmp directory, but I don't know how to fix it. Is there a way to solve the problem, so that I can use SlackBuild scripts?
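If /tmp is the bottleneck, most SlackBuild scripts honour a TMP variable (the SlackBuilds.org convention is `TMP=${TMP:-/tmp/SBo}`) so the work area can be pointed somewhere roomier; check the individual script, since the variable name is a convention, not a guarantee. A sketch with placeholder paths:

```shell
# See how much room /tmp actually has first:
df -h /tmp

# Build with the work area and output directory on a bigger filesystem:
mkdir -p /home/build/tmp
TMP=/home/build/tmp OUTPUT=/home/build/tmp ./abiword.SlackBuild
```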
I have a system with 2 GB of memory and 4 GB of swap.
This is the output from:
PHP Code:
Why is so much memory being used for cache? Occasionally swap gets used, even though the system could reclaim the cache memory instead of swapping...
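The usual knob for this trade-off is vm.swappiness, which biases the kernel between swapping out process memory and dropping page cache. A sketch (the value 10 is an illustrative choice, not a recommendation; writing requires root):

```shell
# Current preference (0-100; distro defaults are often 60):
sysctl vm.swappiness

# Prefer reclaiming cache over swapping, for the running system:
sysctl -w vm.swappiness=10

# Persist across reboots:
echo 'vm.swappiness = 10' >> /etc/sysctl.conf
```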
I'm running 13.1 on a Dell Latitude D630 with 4GB of RAM. My problem is that Slackware doesn't seem to see all of the available memory. Here are the numbers being reported (just including the pertinent ones rather than all the output)...
When I run the Ardour sound editor I get this message, but it starts OK: "Your system has a limit for maximum amount of locked memory! This might cause Ardour to run out of memory before your system runs out of memory. You can view the memory limit with 'ulimit -l', and it is normally controlled by /etc/security/limits.conf."
Code:
bash-4.1$ ulimit -l
64

My limits.conf is like this:

Code:
audio - rtprio 99
@audio - memlock 250000
Notice how big and widely spaced the fonts on the Clementine playlist are, and how good they look on the appmenu (where my mouse pointer is). This is not because Clementine is Qt4; I've got the same problem with Chrome, Opera, etc. I had been messing with System Settings (the KDE settings tool) the day before the fonts became that widely spaced, in order to make my KDE apps look more native on my GNOME desktop, but I haven't touched the font settings there.
I have 2 sets of Skype logs: one on my work PC (Windows) - and I forgot to take a copy when I finished the assignment - and one on my own Slackware PC. It would be very useful to have only one set (and the same for instant message logs) on my USB memory stick. Is this possible?
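On the Linux side this is typically done by keeping one shared log directory on the stick and replacing the local log directory with a symlink to it. Demo paths are used below so the commands are safe to try as-is; in real use STICK would be the stick's mount point and APPLOGS wherever your Skype/IM client actually keeps its logs (an assumption, since the exact path depends on the client version):

```shell
STICK="$HOME/usb-demo"          # e.g. /mnt/usbstick in real use
APPLOGS="$HOME/.Skype-demo"     # e.g. the client's real log directory

mkdir -p "$STICK/chatlogs"
rm -rf "$APPLOGS"               # move real logs onto the stick first!
ln -s "$STICK/chatlogs" "$APPLOGS"   # both machines point at one shared set
```

The catch is the Windows machine: the same trick needs an NTFS junction or the client's own option for relocating its data directory.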
I don't understand how capabilities work. I have tried using dac_override, which is explained as overriding any DAC constraint, but when I try to use some root privileges, such as `ls /root`, it is still denied with a permission problem. Also, in which situations is a profile transition used? How do I use these attributes?
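Assuming the question is about Linux file capabilities (profile transitions are an AppArmor/SELinux concept and a separate mechanism): the capability has to be attached to the binary's file, not to the user, which is why a plain `ls /root` from an unprivileged shell still fails. A sketch (setcap itself needs root, and file capabilities don't work on filesystems mounted nosuid):

```shell
# Grant CAP_DAC_OVERRIDE to a copy of ls rather than running it as root:
cp /bin/ls /usr/local/bin/ls-cap
setcap cap_dac_override+ep /usr/local/bin/ls-cap

getcap /usr/local/bin/ls-cap    # verify the capability is attached
/usr/local/bin/ls-cap /root     # now succeeds for an ordinary user
```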
I used mkinitrd to build an initrd from the Slackware 2.6.37.6 sources. LILO throws the following: "Warning: The initial RAM disk is too big to fit between the kernel and the 15M-16M memory hole. It will be loaded in the highest memory as though the configuration file specified 'large-memory' and it will be assumed that the BIOS supports memory moves above 16M."
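That warning is informational: LILO already falls back to loading the initrd high. If the fallback works, the message can be silenced by declaring the behaviour explicitly; a sketch assuming a standard /etc/lilo.conf:

```shell
# Add the global option (outside any image= section) to /etc/lilo.conf:
#
#   large-memory
#
# then reinstall the boot loader:
lilo -v
```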
Also, I am running swap, / and /home on an encrypted volume group. When the initrd boots (but prior to mounting the encrypted VG), I get a message saying that no modules are found. Sounds like a daft question, but is this expected? I expect that this is because the initrd is looking for modules but can't find them, because the relevant partition isn't mounted yet.
I tried the rescue mode of the install DVD, but it doesn't have any package management initialized. What I want to do is have a LiveCD environment where I can install an RPM package into the live running system (not persistently, but pretending so in RAM).
Ubuntu can do this. If I choose "Try Ubuntu" in the boot menu selection, I get a live system that has the DEB package manager ready to go. It can install more packages that depend on packages already installed.
I just want the same, but with Fedora. If 16 GB of RAM is not enough, I can get more.
I am looking to set up some computers as thin clients to connect to my MS Terminal Servers. We have a homebrew application that configures RDP sessions. A user logs into a webpage that dynamically generates a "launch.rdp" file. This file is generated to balance the load between servers. What I'd like to do is configure an image to boot up into Mozilla. When the user goes to the webpage, I'd like the launch.rdp file to open up in a terminal server session. I've tried rdesktop and tsclient, but unfortunately neither would work with an .rdp file.
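One workaround is a tiny wrapper that the browser is configured to open *.rdp files with: it pulls the server out of the file's "full address:s:..." line (standard .rdp file syntax) and hands it to rdesktop. A sketch; the script name and full-screen flag are choices, not requirements:

```shell
# Save as e.g. /usr/local/bin/rdp-launch, mark executable, and set it as
# the browser's helper application for .rdp downloads.

rdp_host() {
    # Extract the server from the "full address:s:..." line; strip the CR
    # that Windows-generated .rdp files carry on every line.
    tr -d '\r' < "$1" | sed -n 's/^full address:s://p'
}

launch_rdp() {
    # Open a full-screen session on the extracted host.
    rdesktop -f "$(rdp_host "$1")"
}
```

Usage from the browser's point of view is simply `rdp-launch /tmp/launch.rdp` (the wrapper script would call `launch_rdp "$1"`).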
I am using Ubuntu 9.10 64-bit. The other day my machine acted up; I had to restart it 5-6 times to get it to start, and now I have no download capability. I have tried Yahoo and Google to no avail.
I have found Edit > Preference > Overall > Download, but nothing works.
I have a dual boot: Windows 7 and Ubuntu 10.04. I have no problems connecting to my modem/router with Windows 7; wireless works just fine. With Ubuntu 10.04, after many months with no problems, I suddenly lost the capability of connecting to the modem/router. No hotspots show up as they previously did. I have my own wireless network set up as "hidden". When I try to connect to it, I get an info box stating "Wireless Network disconnected"...