Server :: uShare - Directory Recursion And Symbolic Links / "Drill Down" Directory Structure To List Files?
Jan 22, 2010
I have uShare 1.1a set up to talk to my Xbox 360. If I share a directory that has no subdirectories, the video files display on the Xbox. However, most of my files are in subdirectories on a different partition - I don't really want to copy them to the share, but uShare doesn't seem to recognise any subdirectories or the files contained therein.
I have tried setting up symbolic soft links directly to the video files (although this is a pain, it is better than moving the files)...
Code:
ln -s /home/jonftp/TV-Shows/Buffy/Season-1/Buffy-101.avi /home/share/Buffy-101.avi
...but these don't show up on the XBox either.
How can I get uShare to "drill down" the directory structure to list the files or how can I get uShare to follow symbolic links?
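If uShare will not follow the symlinks, a bind mount is one possible workaround, since it makes the original directory appear inside the share without copying anything. This is only a sketch; the paths match the example above and may need adjusting:
Code:
# Bind-mount the TV-Shows tree into the shared directory (run as root)
mkdir -p /home/share/TV-Shows
mount --bind /home/jonftp/TV-Shows /home/share/TV-Shows
# To survive a reboot, an /etc/fstab entry like this could be added:
# /home/jonftp/TV-Shows  /home/share/TV-Shows  none  bind  0  0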
I'm using FC10 and I want to create a symlink to my movies directory in my home folder:
This is what I did: in /var/www/html I created the symlink:
Code:
ln -s /home/username/movies movies
Then in /etc/httpd/conf/httpd.conf:
Code:
DocumentRoot "/var/www/html"
<Directory />
    Options FollowSymLinks
    AllowOverride None
</Directory>
<Directory "/var/www/html">
    Options Indexes FollowSymLinks
    AllowOverride None
    Order allow,deny
    Allow from all
</Directory>
<Directory "/home/username/movies">
    Options Indexes FollowSymLinks
    Order allow,deny
    Allow from all
</Directory>
After restarting Apache, the test page works.
The directory /home/username/movies has the following permissions:
drwxrwxrwx 2 apache apache 4096 2009-03-05 23:43 movies
When trying to access my web page at localhost/movies I get the 403 Forbidden error. OK then, testing as the apache user:
sudo -u apache ls /var/www/html - this works
sudo -u apache ls /var/www/html/movies - returns the permission denied error
sudo -u apache ls /home/username/movies - permission denied as well
Is the apache user chrooted by default? SELinux is in permissive mode. What can I do?
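With SELinux permissive, a 403 on a symlinked home directory is most often caused by a parent directory (typically /home/username) that the apache user cannot traverse. A quick way to check each path component, and one possible fix if that is the case:
Code:
# Show the permission bits of every component of the path (namei is part of util-linux);
# a missing "x" for "other" on /home/username would explain the 403
namei -m /home/username/movies
# If /home/username lacks execute permission for "other", this would grant it:
chmod o+x /home/username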
Say /dir2/sub1link is a symbolic link to /dir1/sub1 and my current working directory is sub1link. Is there a quick way to either: change directory to the link source's parent (i.e. something similar to cd .. but taking me to /dir1/), or change directory to the link source itself (i.e. switch from /dir2/sub1link/ straight to /dir1/sub1)?
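Assuming /dir2/sub1link really is a symlink to /dir1/sub1, bash's cd -P (physical) option resolves the link, so both moves can be done without typing the full target path:
Code:
cd -P .    # re-enter the current directory via its physical path: /dir1/sub1
cd -P ..   # go to the physical parent of the link target: /dir1
pwd -P prints the resolved path without changing directory at all.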
I am trying to make my Apache server show symbolic links in a directory listing, but so far I have been unsuccessful. In my latest attempt, I have placed the following code in .htaccess, in the directory with the symlinks that I want listed:
Code:
<Directory />
    Options All
</Directory>
In httpd-vhosts.conf, I have also placed the following code within the relevant <VirtualHost></VirtualHost>:
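(The <VirtualHost> snippet itself isn't shown above.) One thing worth checking: <Directory> blocks are not valid inside .htaccess and usually cause a 500 error; in .htaccess the options are normally set directly, provided the enclosing <Directory> in the server config allows it with AllowOverride Options. A minimal sketch, with the directory path as a placeholder:
Code:
# Append the option line to the .htaccess in the directory containing the symlinks
echo 'Options +Indexes +FollowSymLinks' >> /path/to/symlink-dir/.htaccess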
When I run "ls -al somedir*" (I use the "ll" shortcut, actually), Linux not only list files that match, but also the contents of directories whose name also happens to match.Is there a way to limit "ls" so that it will only show names (files and directories) and ignore the contents of the directories?
How do I download all the files from here: [URL]. I am on FreeBSD 7.0 and I tried wget with the -r switch, but it gives me URLs only. Maybe this is simply not an FTP site, I don't know. How can I download all those files with the same directory structure?
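Since the actual address is not shown, here is a generic recursive wget invocation with a placeholder URL; -np keeps it from climbing to parent directories and -nH drops the hostname from the saved paths, so the remote directory structure is reproduced locally. The same form works for ftp:// URLs.
Code:
# http://example.com/files/ is a placeholder for the real location; adjust --cut-dirs to the path depth
wget -r -np -nH --cut-dirs=1 http://example.com/files/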
I have a drive with an NTFS partition where all the files were deleted. What I'm looking for is a way to rebuild the directory structure and recover the files. I really, really want the directory structure, as the partition contains 460 GB of data. Normally I would use the tools here: [URL], but I've never dealt with this much data before; everything there that I've used creates a pretty messy dump.
I have used ntfsundelete before but only for a few files at a time. I have no idea what would happen if I tried to run it on a partition of that size. I'm comfortable with data recovery but this amount of data is beyond me. I've run ntfsundelete with no args and from what I can tell of skimming the pages of output all the files are fine. The partition has not been written to.
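For reference, ntfsundelete can be pointed at the whole partition and filtered, although how it behaves on 460 GB is untested here, and as far as I know it recovers files into a flat destination directory rather than rebuilding the original tree, so it may not fully satisfy the directory-structure requirement. A rough sketch, with /dev/sdX1 standing in for the real partition and option names taken from the ntfsundelete man page (worth double-checking against the installed version):
Code:
# Scan the partition and keep the listing for later inspection
ntfsundelete /dev/sdX1 > undelete-scan.txt
# Recover everything reported as 100% intact into a separate destination directory
ntfsundelete /dev/sdX1 -u -m '*' -p 100 -d /mnt/recovery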
What is the best and simplest way to compare two directory structures without actually comparing the data in the files? This works fine: diff -qr dir1 dir2 - but it's really slow because it compares the files too. Is there a switch for diff or another simple CLI tool to do this?
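One simple approach is to compare the name listings instead of the files themselves, for example with process substitution in bash:
Code:
# Compare only the directory trees (names), not file contents
diff <(cd dir1 && find . | sort) <(cd dir2 && find . | sort)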
I have a file that I am attempting to use to restore a website for a client. It is a Drupal folder structure in which all of the files (3000+) have been individually compressed into 3000+ individual .gz files.
I have figured out how to uncompress all the files using WinRAR, but it puts them all into a single directory; I need the original folder structure maintained.
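Decompressing the files in place on Linux, rather than through WinRAR, keeps each file in its own folder, since gunzip replaces every .gz with the uncompressed file in the same directory:
Code:
# Run from the top of the restored Drupal tree; the folder structure is preserved
find . -type f -name '*.gz' -exec gunzip {} +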
I'm using OpenSSH 5.5p1 on Fedora 15. I'm trying to get ChrootDirectory to work; specifically, I'm trying to figure out why I can't write files to a sub-directory of the chroot directory. I created a user test_user and a group called sftp, added test_user to the sftp group, and edited /etc/ssh/sshd_config as follows:
Code:
Subsystem sftp internal-sftp
Match Group sftp
    ChrootDirectory /home/sftp_users/%u
    X11Forwarding no
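sshd is strict about the chroot target: /home/sftp_users/test_user itself must be owned by root and must not be writable by the group or by others, which is exactly why writes into it fail. Uploads therefore go into a user-owned subdirectory; a sketch of one common layout (the "upload" name is arbitrary):
Code:
mkdir -p /home/sftp_users/test_user/upload
chown root:root /home/sftp_users/test_user              # chroot directory must be root-owned
chmod 755 /home/sftp_users/test_user                    # ...and not group/world writable
chown test_user:sftp /home/sftp_users/test_user/upload  # writable subdirectory for the user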
I want to make symbolic links to all of them in my current directory, /test2.
I tried
But it failed. It seems like I can't make symbolic links to all five files simultaneously.
Oftentimes I need to make symbolic links for multiple files with some common pattern (just like ".txt" here). I really hope to avoid making a symbolic link for each of them one by one...
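As long as the source paths are absolute (or correct relative to /test2), ln accepts several targets when the last argument is a directory, so the whole pattern can be linked in one command. /path/to/test1 below is a placeholder for the real source directory:
Code:
cd /test2
ln -s /path/to/test1/*.txt .    # one symlink per matching file, created in the current directory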
How can we list only the files present in a directory in Red Hat Linux? The ls command lists both files and directories. What command can be used for this purpose?
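find can restrict the listing to regular files in the directory itself, or the ls output can be filtered on the type column:
Code:
find /some/dir -maxdepth 1 -type f    # regular files only, no subdirectories
ls -l /some/dir | grep '^-'           # alternative: keep only lines for regular files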
When I try to list the files in a directory, I get an I/O error:
#ls -l /test
Why am I getting this error, and what are these I/O errors?
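An I/O error from ls usually points at the block device or the filesystem rather than at ls itself, so the kernel log is the first place to look; a couple of quick checks (smartctl assumes smartmontools is installed and /dev/sda is the right disk):
Code:
dmesg | tail -n 30       # look for disk (ata/sd) or filesystem errors around the time of the failure
smartctl -H /dev/sda     # overall drive health report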
I'm trying to write a bash script that gets the list of files in a directory and puts them into a variable, then checks each entry and outputs them as follows:
item1 is a FILE
item2 is a DIR
item3 is a DIR
etc.
I am able to get the list of files into a variable, but I am unsure how to get the output I want.
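A minimal sketch of the loop, assuming the directory to examine is held in $dir (a placeholder here); the [ -d ] and [ -f ] tests do the classification:
Code:
#!/bin/bash
dir=/some/dir                      # directory to examine (placeholder)
for item in "$dir"/*; do
    name=$(basename "$item")
    if [ -d "$item" ]; then
        echo "$name is a DIR"
    elif [ -f "$item" ]; then
        echo "$name is a FILE"
    fi
done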
How would I go about copying files to a directory, yet skipping the files that already exist in the directory, and also removing the files that are already in the directory? For example:
Code:
$ ls /dir1
img001.jpg  img002.jpg
[code]....
Now I would like to copy from dir1 to dir2, but the contents of dir2 would then be:
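(The intended final listing isn't shown above.) For the "skip files that already exist" part, GNU cp has a no-clobber flag, and rsync offers the same behaviour and is easier to extend:
Code:
cp -n /dir1/* /dir2/                        # -n / --no-clobber: never overwrite an existing file
rsync -av --ignore-existing /dir1/ /dir2/   # alternative: skip anything already present in dir2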
I have the following file structure that I would like to add to git.
Code:
These are big directories and I don't need them all checked out. I only need the src directory. After I commit the files in /app/src, they must be pushed to a remote site.
If I want to check out only the src directory to work on, is it important to create a special file structure in git? For example, instead of doing git init on the general app directory, should I do git init in all the subdirectories?
Is it possible to check out only part of a file structure in git?
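Per-subdirectory repositories aren't needed; a single repository at the top of app can track just src, and a sparse checkout can later populate only that directory in the working tree. A sketch, with <remote-url> as a placeholder (newer Git versions also provide a dedicated git sparse-checkout command):
Code:
cd app
git init
git add src                                  # track only the src directory
git commit -m "Add src"
git remote add origin <remote-url>           # placeholder for the real remote
git push -u origin master
# Later, to check out only src from a clone:
#   git clone --no-checkout <remote-url> app && cd app
#   git config core.sparseCheckout true
#   echo 'src/' >> .git/info/sparse-checkout
#   git checkout master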
There are two directories A and B and a file F which is located in B. The working directory is B. How can you create a symbolic link in A pointing to F in B without changing the directory?
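Assuming A and B are sibling directories and the shell is sitting in B, the link is simply created by path, with either an absolute or a relative target:
Code:
ln -s "$PWD/F" ../A/F    # absolute target, link created inside A
ln -s ../B/F ../A/F      # relative target, valid as long as A and B stay siblings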
Unfortunately, I deleted my /home/ directory by accidentally running "rm -rf *". The partition (/dev/sda3) has an ext3 filesystem. After deleting the /home directory, I shut down the PC and rebooted from a RIPLinux live USB, which has some tools that allowed me to recover some files. However, what I would like to do is recover the directory tree structure, rather than the files, in order to see which files I deleted.
What I want is exactly the following: the output of "ls -lR /home/" as it was before deleting all the files, but the problem is that the /home directory is now empty.
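Recovering name and tree information from ext3 is hit-and-miss because ext3 zeroes block pointers on delete, but journal-based tools such as extundelete can sometimes restore files under their original relative paths, which would also reveal the tree. A rough sketch, run against the unmounted partition:
Code:
# /dev/sda3 must not be mounted read-write while this runs;
# recovered files land in ./RECOVERED_FILES, keeping paths where they can be determined
extundelete /dev/sda3 --restore-all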
The clearly outdated Linux course I'm using tells me that the directory structure for building RPMs is in /usr/src/redhat, but on my Red Hat system there are only /usr/src/debug and /usr/src/kernels folders.
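On current Red Hat and Fedora systems the per-user ~/rpmbuild tree has replaced /usr/src/redhat; rpmdevtools can create it:
Code:
yum install rpmdevtools    # provides rpmdev-setuptree (use dnf on newer releases)
rpmdev-setuptree           # creates ~/rpmbuild/{BUILD,RPMS,SOURCES,SPECS,SRPMS}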
I am running both Ubuntu and XP and have a local server for my computer on both systems. Both partitions have a www directory that is accessed when I type localhost into my browser.
I want to be able to work on the project in both systems and have the changes I make show up in both. So my question is: how can I make "localhost" point to the Windows www directory instead of /var/www when I start up the server?
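One approach on the Ubuntu side, assuming the XP partition can be mounted, is to bind-mount its www folder over /var/www so Apache keeps its existing DocumentRoot. The device name and the /media/windows mount point below are placeholders:
Code:
# Mount the Windows partition (adjust the device and mount point to match the system)
sudo mount -t ntfs-3g /dev/sda1 /media/windows
# Make /var/www show the Windows www directory
sudo mount --bind /media/windows/www /var/www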
I am using Lubuntu 10.10 and have installed the application that I need. What I need is a link on the desktop that will execute the program from the /usr/share/test directory. When I execute the program I need to use "sudo" in order to run it. So my question is: how would I add a symbolic link to the desktop with the appropriate permission to run the executable? Also, I need to add a custom icon to the desktop.
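Rather than a plain symlink, a desktop launcher can wrap the command in gksudo (from the gksu package, if installed) so the password prompt appears graphically, and it also carries the custom icon. The sketch below writes such a launcher; /usr/share/test/myapp and the icon path are placeholders for the real names:
Code:
cat > ~/Desktop/myapp.desktop <<'EOF'
[Desktop Entry]
Type=Application
Name=My App
Exec=gksudo /usr/share/test/myapp
Icon=/usr/share/test/myapp.png
Terminal=false
EOF
chmod +x ~/Desktop/myapp.desktop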