I have a VPS running Debian. The time reported by the date command is two minutes fast. Is there a service I can install that will keep the time correct?
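A minimal sketch of the usual fix, assuming a stock Debian with internet access: install the ntp daemon, which continuously disciplines the clock against the Debian pool servers.
Code:
# as root; the daemon starts automatically and slews the clock into sync
apt-get install ntp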
I have a dual boot on my computer: Windows XP and Fedora 11. In both systems the time zone is set to Belgrade (which is my time zone), but when I set the clock in Fedora to, say, 16:15 and then switch to Windows, it says the time is 14:15. When I set Windows to 16:15 and switch to Fedora, it says the time is 18:15. So I can't get accurate time in both systems either way.
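That fixed two-hour offset is the classic symptom of Fedora treating the hardware clock as UTC while Windows treats it as local time. A sketch of one common fix, assuming Fedora 11 still honors the UTC flag in /etc/sysconfig/clock: tell Linux that the CMOS clock stores local time.
Code:
# /etc/sysconfig/clock -- hardware clock holds local time, not UTC
UTC=false
# write the current system time back to the CMOS clock as local time
hwclock --systohc --localtime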
I have to copy and move files between two systems all the time. So when I am on system1, I simply use the command
Code: $ scp * system2:/some_directory
There are many files with different extensions in the PWD on system1. Of all of them, I don't need a file called *residual.dat, as it is particularly big and wastes a lot of time copying. How can I make a shortcut so that every time I do scp, it copies everything but the *residual.dat file?
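A sketch of two common approaches, assuming a bash shell on system1: switch to rsync, which supports excludes directly (and could be wrapped in an alias), or use bash's extended globbing to match everything except that file.
Code:
# option 1: rsync with an exclude pattern
rsync -av --exclude='*residual.dat' ./ system2:/some_directory/
# option 2: bash extended globbing
shopt -s extglob
scp !(*residual.dat) system2:/some_directory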
After a disastrous attempt to upgrade Ubuntu 8.04 to 8.10 (system would not boot), I partitioned the disk in two and installed 9.10 on the new partition.
The old data was still available so I copied my old home directory on top of my new one thus:
I have managed to re-install all my software, but a few problems remain.
Here is one of them:
In spite of marking (with Palimpsest Disk Utility) the old partition (with 8.04/8.10 on it) as not bootable, at boot time I am offered three versions of Ubuntu 8.10 as well as 9.10, plus the corresponding recovery-mode entries. Any idea how to get rid of these options without formatting the partition?
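Those menu entries come from GRUB, not from the partition's boot flag, so toggling the flag in Palimpsest has no effect. Assuming the 9.10 install uses GRUB 2, one sketch: stop os-prober from scanning other partitions for bootable systems, then regenerate the menu.
Code:
# keep update-grub from adding entries for the old install
sudo chmod -x /etc/grub.d/30_os-prober
sudo update-grub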
I am about to work on getting the workstations on my network connected to the Active Directory on my Windows 2003 server using Likewise Identity Service. One of the security requirements is good time sync, so I am trying to point my CentOS machines at my Windows server as a time server. These machines dual boot Windows 7 and CentOS 5.5. I am using the Windows server as a time server, and it gets its time from its CMOS; I configured it following a Microsoft KB article. I removed all the time servers from the CentOS box I am experimenting with and added the IP of my Windows server. It seems to connect OK, but the time never gets updated. Oh, and this network has no connection to the internet; it's cut off from the world, so sad and lonely, and cannot get internet time.
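One thing worth checking: ntpd refuses to sync to a peer it considers unsynchronized, and a Windows box serving its own CMOS time often advertises itself that way unless the w32time registry keys (AnnounceFlags and friends) from the KB article are set. On the CentOS side, a sketch of /etc/ntp.conf, assuming 192.168.1.10 is the Windows server:
Code:
# /etc/ntp.conf on the CentOS client
server 192.168.1.10 iburst
# optional fallback so ntpd keeps serving time if the server vanishes
server 127.127.1.0
fudge 127.127.1.0 stratum 10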
I have Ubuntu Desktop 10.04 LTS. I installed Samba and am able to access the share from Windows machines. However, I want the share to be accessed from around 300 Windows machines at a time. Is that possible?
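Samba imposes no hard client limit of its own; 300 concurrent clients is a question of RAM, disk, and network rather than configuration. For reference, a hypothetical share definition with the per-share connection cap left at its unlimited default:
Code:
# /etc/samba/smb.conf -- illustrative share; 0 means unlimited connections
[shared]
   path = /srv/shared
   read only = no
   max connections = 0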
I have a set of machines on a disconnected network. Periodically, one of the machines connects to the internet and synchronizes its time with a time server that is not known until the connection is established. (The machine queries a central command server for the address of the time server it should synchronize to.)
I then use a custom tool to do some calculations and call adjtimex() to adjust the clock so that it runs fairly accurately.
I know ntpd is supposed to be able to handle disconnected networks but I thought you had to preconfigure the servers in the configuration file.
My intent is to run ntpd on this machine (without configured "server"s) so that it can serve time to the internal network. (Periodic synchronization using ntpdate from the internal machine to the bridge machine.)
The problem: ntpd wants to fuss with the values I set using adjtimex(). I want it to quit thinking it needs to adjust the clock and just serve time to the internal network. (Maybe I have a GPS time source hooked directly to the machine!)
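A sketch of the classic recipe for this, assuming a stock ntpd: configure only the Undisciplined Local Clock driver so ntpd serves the kernel clock to the LAN without steering it from anywhere, and disable its kernel discipline so it leaves the adjtimex() settings alone.
Code:
# /etc/ntp.conf -- serve the local clock as-is
server 127.127.1.0              # undisciplined local clock driver
fudge 127.127.1.0 stratum 8     # advertise a usable stratum to LAN clients
disable kernel                  # keep ntpd away from the kernel clock discipline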
Is there any web-based open source solution for employee time and attendance?
I know TimeTrex is one such solution, but is there any other open source software for time, attendance, and payroll besides TimeTrex, so that I can make a comparison and select the one that best suits my requirements?
I am running Ubuntu 10.04 on an Intel X25-M Postville 160 GB SSD with ext4. How can I tell if there's something wrong? What should/can I do to maintain its performance and health? Should I use TRIM, and is Linux's recent TRIM support reliable?
This may look like a duplicate of this question, but I am asking more in terms of good practices and learning how to use this new technology the right way...
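For reference, a sketch of how online TRIM is typically enabled on ext4 via the discard mount option; note this needs a kernel newer than the 2.6.32 that 10.04 ships, and the UUID below is a placeholder.
Code:
# /etc/fstab -- "discard" turns on online TRIM for ext4
UUID=xxxxxxxx-xxxx  /  ext4  discard,noatime,errors=remount-ro  0  1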
Will it be possible to maintain multiple sessions on Ubuntu? We are working on an open source application and have set it up on one of our Ubuntu machines, which acts as a server for the developers' Windows machines. Is there any utility (preferably open source) with which we could access the Ubuntu server from all 4-5 developers' Windows machines?
Our requirement is to access the whole system, including console and GUI (like remote access), using different sessions, so that all 4-5 developers can work on the same Ubuntu machine at the same time.
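A sketch of one common setup: xrdp gives each connecting user an independent graphical session, and Windows machines can use the built-in Remote Desktop client (mstsc); plain SSH covers the console side.
Code:
# on the Ubuntu server
sudo apt-get install xrdp openssh-server
# each developer connects from Windows via mstsc (GUI) or PuTTY (console);
# every login gets its own session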
I am building an application that would need to store data in many different folders (say, 5,000,000 folders). Obviously that is a big number of folders.
My project is flexible in terms of how to outline the folders (meaning, some folders can be sub-folders of others etc).
Given the Ubuntu file system, is there a rule of thumb for how many sub-folders should be in a folder, and how many files should be in a folder? (I am asking strictly from a performance point of view.)
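There is no hard limit to aim for, but directories with tens of thousands of entries get slow to scan even on ext4; a common pattern is to fan entries out by a fixed-width prefix so no single directory grows huge. A hypothetical sketch of that layout in bash:
Code:
# store item 1234567 as data/12/34/1234567 -- two-level prefix fan-out
id=1234567
mkdir -p "data/${id:0:2}/${id:2:2}/$id"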
I just installed openSUSE 11.3 and everything seems to work perfectly out of the box, except for the wireless card. I am currently using the ath9k driver.
The problem is that if I connect to a WPA-protected network, the connection drops about every 20 seconds and asks for the network key again. Eventually, after about 15 minutes, it completely stops working and I have to reboot in order to use it again. Unloading and reloading the ath9k module has no effect; a full reboot is needed. I have not yet tried ath5k or madwifi instead, but on Ubuntu, the ath5k and madwifi drivers do not associate with my card.
Below is one "cycle" in the NetworkManager log, from connected to the next connected:
Code:
Jan 6 23:37:59 linux-xyz NetworkManager: <info> (wlan0): supplicant connection state: completed -> group handshake
Jan 6 23:37:59 linux-xyz NetworkManager: <info> (wlan0): supplicant connection state: group handshake -> completed
Jan 6 23:38:30 linux-xyz NetworkManager: <info> (wlan0): supplicant connection state: completed -> group handshake
Jan 6 23:38:30 linux-xyz NetworkManager: <info> (wlan0): supplicant connection state: group handshake -> ...
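The handshake cycling every ~30 seconds often points at power save or group-rekey trouble with ath9k of that era. One hedged thing to try before switching drivers: turn off wireless power management and see whether the drops stop.
Code:
# disable power saving on the interface (does not survive a reboot)
sudo iwconfig wlan0 power off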
I have Squid Cache version 2.6.STABLE21 installed on RHEL5, with SARG configured. It provides download details for Squid users but no upload details. I want to maintain an upload record for my users (e.g., which user uploaded which file/data).
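A sketch of one way to capture this with Squid's own logging, assuming Squid 2.6's logformat codes: record the request method and the bytes the client sent, so POST/PUT uploads appear with their size and a script (or SARG) can report on them.
Code:
# squid.conf -- timestamp, client, user, method, URL, request (upload) size
logformat uploads %ts.%03tu %>a %un %rm %ru %>st
access_log /var/log/squid/upload.log uploads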
I'm working at a corporate office with Server: RHEL 5.1 and Database: Oracle in /var. Now the /var partition is out of space. They want to add a fresh 500GB hard disk and set up LVM on that disk; /var was a normal partition before. In this scenario, what should I do so that users can access all the data as usual?
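A sketch of the usual migration, assuming the new disk appears as /dev/sdb and that everything using /var (including Oracle) is stopped during the copy; the volume names are illustrative:
Code:
pvcreate /dev/sdb                   # put the new disk under LVM
vgcreate vg_var /dev/sdb
lvcreate -L 490G -n lv_var vg_var
mkfs.ext3 /dev/vg_var/lv_var
mount /dev/vg_var/lv_var /mnt
rsync -a /var/ /mnt/                # copy preserving ownership and permissions
# point /var at the new volume in /etc/fstab, then remount or reboot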
I have a system with data stored in multiple disk arrays. I have to come up with a solution that will maintain the disk order of the arrays whenever a stripe fails, is removed, and is then put back in. One solution I came up with is to stamp every stripe with the disk array it belongs to, along with its stripe id. I plan to put this stamp in the last 512 KB of each disk, and to maintain all this information in a SQLite database: disk array, stripe id, the software disk name, etc. That way, whenever a disk is replaced, its stamp can be read and the corresponding entries in the database updated.
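For reference, a hypothetical sketch of such a tracking table created through the sqlite3 command-line tool (the path and column names are illustrative):
Code:
sqlite3 /var/lib/arrays.db "CREATE TABLE IF NOT EXISTS stripes (
    array_id   TEXT,
    stripe_id  INTEGER,
    disk_name  TEXT,
    PRIMARY KEY (array_id, stripe_id));"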
I recently started to get my large packages (like openoffice, texlive, and the VLC SlackBuild) from Robby and Eric's webpages. Is there an easy way to keep these packages up to date? The only way I know of is manually checking their websites periodically to see if there has been a change in the package version, but I would much rather be notified in some way when there is an update available.
Does anyone have a script or similar system to do this? I ask here because I want to know if other Slackers have already figured out a better method than what I would be able to think of.
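A hypothetical sketch of the brute-force approach, run from cron: hash each package page and mail yourself when the hash changes. The URL and mail address are placeholders.
Code:
#!/bin/sh
# crude page-change watcher; URL and mail address are placeholders
url="http://example.com/packages/"
new=$(curl -s "$url" | md5sum | cut -d' ' -f1)
old=$(cat ~/.pkgwatch 2>/dev/null)
if [ "$new" != "$old" ]; then
    echo "$new" > ~/.pkgwatch
    echo "Page changed: $url" | mail -s "package update?" you@example.com
fi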
Will the above procedure accomplish this objective without crippling openSUSE? The second swap partition has never shown any activity (on SUSE). I understand (from Using shared swap files) that a single swap partition may be shared. Since these areas are relatively small, it is not inconvenient to maintain separate swap partitions.
After reading this announcement, I don't understand whether Xubuntu 10.04, or the other Ubuntu 10.04 variants, will also be LTS. What I understand is that:
- Ubuntu 10.04 (Gnome) Desktop Edition is an LTS.
- Ubuntu 10.04 Server Edition is an LTS.
- Ubuntu 10.04 Netbook Edition is NOT an LTS.
- Kubuntu 10.04 seems to be an LTS.
- Xubuntu 10.04
- Mythbuntu 10.04
- Ubuntu Studio 10.04
Obviously, I can't understand how some variants could be LTS and others not: they all use the same repositories. So Canonical would maintain updates for only selected packages? Does anyone know more about how LTS status is assigned?
I have a large number of video files on a couple of high capacity HDDs. The files are reasonably clearly named. I've never bothered creating a database of the files because I am inherently lazy and believe the computer should be doing this for me.
Anyway, crunch time. I had a look at the stuff in the public repositories, but they all involve too much typing, too much work. So, what are others using/doing to maintain their "collection" indexes?
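For comparison, the laziest possible sketch: let find rebuild a flat, greppable index from cron (the mount points below are placeholders).
Code:
# rebuild a plain-text index of video files; paths are placeholders
find /media/disk1 /media/disk2 -type f \
     \( -iname '*.avi' -o -iname '*.mkv' -o -iname '*.mp4' \) \
     -printf '%s\t%p\n' | sort -k2 > ~/video-index.txt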
I'm currently on Ubuntu Server 11.04 x86_64 and have configured a logical volume that spans 2x 2TB HDDs, mounted at /data. The OS is installed on the first HDD (a 500GB one), so the system has 3 HDDs in total (1x 500GB OS disk and 2x 2TB data disks). I want to do a fresh install of Ubuntu Desktop on the system without losing the data in the 4TB logical volume currently mounted at /data. Is this possible, and if so, how?
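Generally yes, as long as the installer only touches the 500GB disk and never formats the two data disks. A sketch of reattaching the volume after the fresh install, assuming the volume group and logical volume are named data_vg and data_lv (substitute the real names reported by vgscan/lvs):
Code:
sudo apt-get install lvm2           # if the desktop install lacks it
sudo vgscan                         # detect the existing volume group
sudo vgchange -ay                   # activate its logical volumes
sudo mkdir -p /data
sudo mount /dev/data_vg/data_lv /data
# add a matching /etc/fstab entry to make the mount permanent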
I have made a game for Linux and want to release it soon (on Linux & Windows). Since it's SDL/OpenGL and I don't do anything special, porting it to Windows shouldn't be much work. The problem is maintaining it. I have the game as a Code::Blocks SDL project on Linux, so I got Wine and installed the Windows Code::Blocks with MinGW so I can cross-compile on Linux. For another game I made, I used a Makefile with "if" statements to set up compile variables, and everything else was totally identical between Windows & Linux (code, source files, etc.).
With Code::Blocks I got used to not having to worry about makefiles, and that worked well: I could focus on making the game rather than editing the makefile every time a new file was added. Is there a nice way to set up a cross-platform environment that makes it easy to build games for Linux and Windows? I'm thinking of making my own system for auto-generating a makefile (essentially upgrading the setup from my previous game to auto-add entries to the makefile, plus some other stuff).
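Before writing a generator, note that GNU make can already discover sources by itself with $(wildcard), and build systems like CMake exist for exactly this problem. A minimal hypothetical Makefile sketch that needs no per-file entries and switches toolchains per target (the MinGW prefix is an assumption; recipe lines must start with a tab):
Code:
# sources are discovered automatically -- no editing when files are added
SRC  := $(wildcard src/*.cpp)
OBJ  := $(SRC:.cpp=.o)
LIBS := -lSDL -lGL

ifeq ($(TARGET),windows)
    CXX  := i586-mingw32msvc-g++    # assumed cross-compiler name
    LIBS := -lmingw32 -lSDLmain -lSDL -lopengl32
endif

game: $(OBJ)
	$(CXX) -o $@ $(OBJ) $(LIBS)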
I need to copy Evolution emails to CD to maintain professional records, and I need step-by-step procedures; my magazine instructions do not cover this. I wish to avoid having to print over 250 long emails per month, and policy prohibits storage on third-party equipment. Files do not drag to copy, and I am also having difficulty copying any files to disk.
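A sketch of one approach, assuming an Evolution of that era keeping local mail under ~/.evolution/mail: bundle the mail store into a single dated archive, then burn that one file to CD with the desktop CD writer (e.g., Brasero).
Code:
# bundle the mail store into one dated archive, then burn it to CD
tar czf ~/evolution-mail-$(date +%Y-%m).tar.gz ~/.evolution/mail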
I've been running Fedora Core 3 on a P4 450 as a personal Samba server and domain controller. It's worked so well that I never gave any thought to upgrading. The other night, I noticed that up2date wasn't working and that Firefox was acting strangely. I made the FC13 installation disks, whereupon I found out that the system didn't have enough memory.
Rather than mess with the P3 450 any more, I swapped main boards and decided to do an upgrade. Is it even possible to do an "upgrade" from 3 to 13? Is it possible to maintain my existing partitions/settings? I've backed up everything that I'd be too unhappy to lose. It's a two-drive system, and the second drive is nothing but data, none of it catastrophic to lose, but at least disappointing. I'd like to keep the data and settings on the primary disk, but won't cry if I can't.
I am running Ubuntu 10.04 [64-bit] on an AMD dual core with 4GB of RAM. My problem: I am mounting a Windows share from my Ubuntu box and everything works as expected; however, when a file is added, deleted, or modified, the Ubuntu file browser does not reflect the change until the next refresh. Windows users with Explorer see the change immediately and automatically.
Is there a way to make the Ubuntu File Browser respond like Explorer when mounting to a Windows share? I call this behavior "maintaining state".