While saving a file I made a typo and ran the vi command ':w~' instead of ':w!', which created a file literally named '~' in one of my subdirectories. How do I remove it without wiping out my real home (~) directory under /root? Do I use unlink()?
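A minimal sketch, assuming the stray file really is named '~' and sits in the subdirectory in question (the path below is a placeholder); the ./ prefix or the quotes keep the shell from expanding it into your home directory:
Code:
cd /path/to/the/subdirectory
rm ./~
# or equivalently:
rm -- '~'
There is no need to call unlink() yourself; rm uses it under the hood.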
I would like to know how I can give subdirectories of a share different permissions from the main share. For example, I share "sharetest" with full access for groups A, B and C, but "sharetest/foo1" should have read-only access for group A, "sharetest/foo2" read-only access for group B, and "sharetest/foo3" read-only access for all of them.
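One possible approach, sketched here on the assumption that the subdirectories can be exported as shares of their own (paths, share names and group names are placeholders, and the underlying filesystem permissions still have to allow whatever access you grant):
Code:
[sharetest]
   path = /srv/sharetest
   valid users = @A @B @C
   read only = no

[foo1]
   path = /srv/sharetest/foo1
   valid users = @A
   read only = yes

[foo2]
   path = /srv/sharetest/foo2
   valid users = @B
   read only = yes

[foo3]
   path = /srv/sharetest/foo3
   valid users = @A @B @C
   read only = yes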
I have the following command that I run on cygwin:
find /cygdrive/d/tmp/* -maxdepth 0 -mtime -150 -type d | xargs du --max-depth=0 > foldersizesreport.csv
I intended the command to do the following: for each folder under /d/tmp/ that was modified in the last 150 days, compute its total size including the files within it, and report it to the file foldersizesreport.csv. However, that is no longer good enough for me, because of what turns out to be inside each subfolder:
So as you see, inside each subfolderX there is a file named somefile.properties, and inside it there is a property SOMEPROPKEY=3808612800100 (among other properties). This value is a time in milliseconds. I need to change the command so that, instead of -mtime -150, the calculation only includes a subfolderX whose somefile.properties has a SOMEPROPKEY value whose time in milliseconds lies in the future. If the value (for example SOMEPROPKEY=23948948) is in the past, then the folder should not appear in foldersizesreport.csv at all, because it is not relevant to me. So the resulting report should look like:
And if subfolder3 had SOMEPROPKEY=34243234 (a time in ms in the past) then it would not be in that csv file. So basically I'm looking for:
find /cygdrive/d/tmp/* -maxdepth 0 -mtime -150 -type d | <keep only subfolders whose somefile.properties has a SOMEPROPKEY time in ms that is in the future, not the past> | xargs du --max-depth=0 > foldersizesreport.csv
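A rough sketch of one way to express that filter in a script, assuming the property file is always named somefile.properties directly inside each subfolder and that SOMEPROPKEY holds milliseconds since the epoch (the file and key names come from the question; everything else is an assumption):
Code:
#!/bin/bash
# Report sizes only for subfolders whose SOMEPROPKEY timestamp lies in the future.
now_ms=$(( $(date +%s) * 1000 ))            # current time in milliseconds

for dir in /cygdrive/d/tmp/*/; do
    props="${dir}somefile.properties"
    [ -f "$props" ] || continue
    # Extract the digits after SOMEPROPKEY= (trailing CR stripped for Windows-edited files).
    val=$(sed -n 's/^SOMEPROPKEY=\([0-9]*\).*/\1/p' "$props" | tr -d '\r')
    [ -n "$val" ] || continue
    if [ "$val" -gt "$now_ms" ]; then
        du --max-depth=0 "$dir"
    fi
done > foldersizesreport.csv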
This is, incidentally, the same message that I see while booting. The error message goes away if I comment out the line in fstab starting with /dev/sdc.
I have a machine (mercury) on which /home/hyperhacker/video is a mounted external hard drive, while the rest of /home/hyperhacker is on the internal hard disk. I have a second machine (konata) using autofs to automatically mount mercury:/home/hyperhacker at /mnt/mercury as needed. This works, except that /mnt/mercury/video shows up empty. mercury:/etc/exports has: Code:
/home/hyperhacker konata(ro,subtree_check)
/home/hyperhacker/video konata(ro,subtree_check)
and I've tried a few variations in konata: Code: $ cat /etc/auto.master
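Just for orientation, a typical autofs indirect-map layout on konata might look like the sketch below; this is not what the poster has (their auto.master contents are not shown above), and the map file name, timeout and mount options are pure assumptions:
Code:
# /etc/auto.master
/mnt    /etc/auto.mercury    --timeout=60

# /etc/auto.mercury
mercury    -fstype=nfs,ro    mercury:/home/hyperhacker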
I'm trying to change to a subdirectory: Code: tony@advent:~/scratch$ cd Home-Arch bash: cd: Home-Arch: No such file or directory
So I list the contents of the current directory: Code: tony@advent:~/scratch$ ls duhome Home-Arch qcad_1.dxf qcad_1.svg runme stdout
OK, I assume I have mis-typed the subdirectory name in ways I cannot detect, so I copy the sub-directory name from the output of the 'ls' command, while within the terminal window, and paste it into the next 'cd' command:
Code: tony@advent:~/scratch$ cd Home-Arch bash: cd: Home-Arch: No such file or directory
I browse the directory and sub-directory in Nautilus - everything is there where I expect it to be. The folders/files are not hidden. What is happening here?
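One quick way to check for invisible characters in the directory name (a debugging sketch, not something from the original post):
Code:
ls -b           # prints C-style escapes for any non-graphic characters
ls | cat -A     # shows tabs (^I), carriage returns (^M) and line ends ($)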
I have Joomla installed at /var/www/joomla. If I navigate to http://localhost/joomla, I get a 404 error, but if I add index.php, the page loads. What is wrong? I am also not able to navigate to the administration part of Joomla for the same reason, and typing in index.php doesn't load it in that case. Yes, I have verified that all files are present in the directory.
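Behaviour like this usually means the web server is not treating index.php as a directory index. A hedged Apache sketch (where this lives is an assumption; on Ubuntu/Debian it may be in a vhost file or in mods-enabled/dir.conf):
Code:
<Directory /var/www/joomla>
    DirectoryIndex index.php index.html
</Directory>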
Using these parameters, the main HTML page and all the images download into the same folder. Instead I would like to have the HTML page in one folder and all the images, CSS etc. in a subdirectory. For example I want to have:
I have a very unusual problem with file and directory ownership. I need to change the ownership of multiple files and directories under a specific subdirectory. Under this directory structure there are files and directories owned by different users and groups. I need to change everything owned by "user1" to "user2", but anything owned by "user3" must be left alone. Is there a simple way to do this, or will I need to traverse the structure and change things one at a time?
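A minimal sketch using find's -user test (the top-level path is a placeholder; substitute your real directory and account names):
Code:
# Change only what is currently owned by user1 to user2; files and
# directories owned by anyone else, including user3, are left untouched.
find /path/to/subdirectory -user user1 -exec chown user2 {} +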
I've used mod_rewrite to rewrite my Apache2 root (my.server.com/) to point to /var/www/drupal6/ instead of /var/www/. I also have a script installed in my cgi-bin (/usr/lib/cgi-bin), but since the rewrite I can't seem to access it. When I go to my.server.com/cgi-bin/test.pl, I get a "page not found" from my Drupal site.
Is there a way I can access my cgi-bin without resorting to accessing my drupal through my.server.com/drupal6/?
Background: I have installed drupal CMS which has a base address of my.server.com/drupal6/ as it resided in /var/www/drupal6/. However I wanted users to be able to access it as my.server.com/ so I added the following into my httpd.conf
Code:
NameVirtualHost *:80
<Directory /var/www/drupal6>
RewriteEngine on
RewriteBase /
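One common approach is to keep the rewrite away from /cgi-bin/ requests by placing a RewriteCond in front of whatever RewriteRule sends traffic to Drupal. That rule is not shown in the post, so the lines below are only a sketch:
Code:
# Do not rewrite anything under the ScriptAlias'd CGI directory.
RewriteCond %{REQUEST_URI} !^/cgi-bin/
RewriteRule ^/?(.*)$ /drupal6/$1 [PT,L]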
Okay, so I decided to switch my NAS over from Windows Server 2003 to Ubuntu 11.04 Desktop. This was a big leap for me as I'm still quite new to Linux, but I thought it'd be worth it. I've got everything working pretty well, such as Apache2, SSH and converting my disks to ext4. My problem, however, is Samba. I've got it sharing out my hard drives okay, but I haven't got effective permissions working just yet. All drives and the documents storage folder are writable by everyone at the moment, but I'm the only one who really uses them on the LAN.
What I need is to be able to do what can be done on Windows Server: create a share, set who can access it (user or group), and then, as in Windows Explorer, change folder permissions so that, say, Elliot can access the share and the folder Stuff while Sandra can access the share but not Stuff. That is easy enough on Windows, but I need the folders under the share to work the same way with Samba. Here is how the drives are mounted and where the documents folder is located:
"Samba share name" at "mount/folder" "datastore1" mounted at "/srv/ds1" "datastore2" mounted at "/srv/ds2"
Is there a script that will take the patches subdirectory of the latest Slackware distribution tree and substitute the new patch .txz files for the corresponding ones in the slackware subdirectory, so that during an ISO install the latest .txz is used instead, and the ISO stays smaller by not carrying the older versions?
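I'm not aware of a canonical script for this, but the substitution step might be sketched roughly as below. The tree path is a placeholder, the name-version-arch-build.txz naming convention is assumed, and a simple glob like this can also match similarly prefixed packages (e.g. foo vs foo-docs), so test it on a copy of the tree first:
Code:
#!/bin/bash
# Replace each package in the slackware/ series directories with the newer
# build shipped in patches/packages, so the ISO carries only the patched .txz.
TREE=/path/to/slackware-tree            # placeholder

for pkg in "$TREE"/patches/packages/*.txz; do
    base=$(basename "$pkg")
    # Strip the last three dash-separated fields (version-arch-build.txz) to get the name.
    name=$(echo "$base" | rev | cut -d- -f4- | rev)
    old=$(find "$TREE/slackware" -name "${name}-*.txz" | head -n 1)
    if [ -n "$old" ]; then
        series_dir=$(dirname "$old")
        rm -v "$old"
        cp -v "$pkg" "$series_dir/"
    fi
done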
If this information already exists here, sorry, I was not able to find it. How do you change permissions in Ubuntu for those who are trying to change permissions on a subdirectory?
Open the terminal and then type: Quote: sudo chmod yourpermissionnumber /thenameofyourdirectory
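For example, to give the owner full access and everyone else read and execute on a subdirectory and everything inside it (the path and mode here are placeholders):
Code:
sudo chmod -R 755 /home/youruser/somedirectory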
I have used the dump command to back up the application files. For a full backup, level 0 works fine. For an incremental backup I used level 1 or 2 and it fails with the error:
DUMP: Only level 0 dumps are allowed on a subdirectory DUMP: The ENTIRE dump is aborted.
The code I used:
===============================
#!/bin/bash
# Full Day Backup Script
# application folders backup
# test is the username
now=$(date +"%d-%m-%Y")
...
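For context, dump keeps its incremental state per filesystem (in /etc/dumpdates), so levels above 0 generally only work when the target is a whole mounted filesystem or its device, not a subdirectory, which is exactly what the error says. A hedged sketch with placeholder device and paths:
Code:
# Incremental levels work against a whole filesystem and update /etc/dumpdates (-u):
dump -1u -f /backup/app-incr.dump /dev/sda5

# Only level 0 is accepted when the target is a subdirectory:
dump -0 -f /backup/app-full.dump /var/www/application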
Near the end of the install, a panel lists dozens of patches, some categorized as "Security", others as "Recommended", each preceded by a checkbox. I started to check all of the patches, but then noticed that the checks were bringing in software I had not requested -- e.g., checking an emacs patch brought in emacs. Rather than bring in all of this additional software, I left all patches unchecked. Now I need to know how I can go back and apply only the patches that pertain to software that I have actually installed.
I'd also like to know whether there is some way to limit the install program panel to relevant patches only.
My laptop screen is quite narrow, and as such I'm finding that Ubuntu's two panels take up too much space for my computing activities. Has anyone found a suitable method of getting all the relevant stuff onto one panel, much in the way other distros do?
I have a bunch of MP3 files and I have their paths grouped in a text file. Is it possible to join the relevant MP3 files based on the paths in the text file losslessly?
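Because MP3 is a frame-based format, plain concatenation joins the audio without any re-encoding (ID3 tags from the later files end up embedded mid-stream, which most players skip, though a tag-aware tool would be cleaner). A minimal sketch, assuming the text file lists one path per line:
Code:
# Concatenate every MP3 listed in paths.txt, in order, into joined.mp3.
while IFS= read -r path; do
    cat "$path"
done < paths.txt > joined.mp3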
Old? They've been around for a while and I wonder if they're still relevant. The following:
Code:
network.http.pipelining
network.http.pipelining.maxrequests
network.http.proxy.pipelining
And adding a new integer Code:
nglayout.initialpaint.delay
Still relevant with the recent versions of Firefox or a waste of time?
Just got a Lexmark Pro200 wireless printer. Can anybody tell me where and how to download the relevant drivers to print over WiFi? I went to Lexmark.com and downloaded what I thought was the correct driver, and the system installed them but cannot find them. When I plug in a USB cable a message tells me that I have connected the correct printer. However, when asked to search, Lexmark does not even show up on the list to select and search. I was successful on my wife's Windows laptop and, after trial and error, on my Mac Mini.
How do I copy a read-only file and make the copy writable with a single cp command in Linux (Ubuntu 10.04)? The --no-preserve and --preserve options seemed like good candidates, except that they seem to "and" the mode flags, while what I am looking for is something that will "or" them (add the +w bit).
More details: I have to import a repository from Git into Perforce. I want all Perforce depot files to be read-only (that is how Perforce was designed), while all other files derived/copied from depot files should be writable. Currently, if a Makefile copies a read-only file, the derived file is also read-only, which leads to build errors when cp tries to overwrite the read-only file a second time. Of course --force is a workaround here, but then the derived file is still read-only. I also do not want to mess with "chmod" after each "cp" command; I will do that only as a last resort.
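For what it's worth, one behaviour worth testing (a sketch, not a guaranteed fit for the Makefiles in question): with --no-preserve=mode, GNU cp gives the destination the default mode derived from your umask rather than copying the source's read-only mode, which normally leaves the copy writable:
Code:
cp --no-preserve=mode depot/readonly.file build/derived.file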
I have a 160GB hard drive on which I installed F12. I would like to upgrade to a bigger drive, but I hate having to re-install everything.
Can anyone recommend a good disk copy utility? It should be able to copy not only the files but the boot sector and everything else, so that I can just make a copy, change my BIOS to boot from the new drive, and run everything as before.
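One low-level option is dd, which copies the partition table, boot sector and filesystems verbatim. The device names below are placeholders: double-check them with fdisk -l first, since swapping if= and of= destroys data, and remember the extra space on the larger drive still has to be partitioned or the filesystems grown afterwards:
Code:
# Clone the whole old drive (here /dev/sda) onto the new, larger drive (/dev/sdb).
dd if=/dev/sda of=/dev/sdb bs=4M conv=noerror,sync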
I installed Ubuntu a couple of days back on my netbook. I am still a beginner, enjoying my adventure exploring Ubuntu. I have another desktop which runs XP, and I am able to access its shared folders from my netbook (Linux). However, I want to copy files, in fact whole folders, from XP using the TERMINAL on my netbook rather than copy-and-paste with the mouse. Are there any commands for it?
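A minimal sketch of one way to do it, assuming the XP machine is reachable as XP-DESKTOP and exports a share called Shared (host, share and user names are assumptions; smbclient is an alternative if you prefer not to mount anything):
Code:
sudo apt-get install cifs-utils       # provides mount.cifs (older releases use the smbfs package)
sudo mkdir -p /mnt/xpshare
sudo mount -t cifs //XP-DESKTOP/Shared /mnt/xpshare -o username=yourxpuser
cp -r /mnt/xpshare/somefolder ~/Documents/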
I am using Windows XP and Debian Linux. In Windows XP I have around 25 GB of free space, but in Linux, if I copy anything, it says there is not enough space to copy.
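A quick way to see how much space each mounted filesystem really has free from the Linux side (purely a diagnostic sketch):
Code:
df -h      # human-readable free space per mounted filesystem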
When I do cp filename destinationfolder, nothing seems to happen. I don't get any error messages or any other indication that the copy failed, but when I go to the destination folder and do ls, the file(s) are not there. I tried it with sudo as well and got the same results. When I first did the copy, it actually copied the file somewhere, but not where I wanted it: it copied it to a folder named Desktop. So I tried copying it from Desktop, and again the same results.
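One thing worth checking (a sketch, since the real paths are not shown): if the destination does not already exist as a directory, cp quietly creates an ordinary file with that name, so verifying the directory first, or adding a trailing slash, makes the mistake obvious:
Code:
ls -ld destinationfolder          # confirm it exists and the listing starts with 'd'
cp filename destinationfolder/    # the trailing slash makes cp fail if it is not a directory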