General :: Alias For Changing To Directory And Displaying All Files
Jul 4, 2011
I'm trying to set up an alias so that when I change to another directory, any directory, it will also display all its contents, like ls -al. Well, that doesn't work. I guess it's an issue with the use of wild-cards. Maybe I should define a new, so far unused, name for the alias, like cdl for example. It would be great if someone could help me. I searched through several examples of bash aliases but couldn't find the right solution.
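A shell function handles this more cleanly than a plain alias, since a function can take the target directory as an argument. A minimal sketch, using the cdl name and the ls -al listing mentioned in the post:
Code:
# Put this in ~/.bashrc; "cdl" changes directory and lists it.
cdl() {
    cd "$1" && ls -al
}
After reloading .bashrc, cdl /etc would change into /etc and immediately list its contents.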
What command will provide you with the number of files in your current directory? Choose one answer.
A. ls -c
B. ls | wc -w (this one)
C. ls -n | count
D. ls -wc (this one?)
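For what it's worth, option B counts the words in the ls output, which equals the number of entries as long as no filename contains spaces; counting lines is a common, slightly more robust alternative. A quick sketch comparing the two:
Code:
# Word count of the ls output (option B); miscounts names with spaces.
ls | wc -w
# Line count of one-entry-per-line output; the usual alternative.
ls | wc -l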
I have directory a and directory b. They are big. b is almost identical to a. "Almost" means that 4-5 files differ, and I don't know which they are. I want to copy b over a, but only the files that differ. I'm in bash. (No, I can't simply delete a and replace it with b, because 1) a is version-controlled and 2) a full copy (or a mv) would take too long. I want to copy only the files that differ.)
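If rsync is available, it does exactly this: it compares the two trees and transfers only the files that differ. A minimal sketch, assuming a and b are sibling directories; add --dry-run first to preview what would be copied:
Code:
# Copy into a only the files in b that differ from a.
# -a preserves permissions and timestamps, -c compares by checksum,
# -v lists each file as it is copied.
rsync -acv b/ a/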
I would like to use something like a 'cdl' alias that would cd into the directory I choose and then ls the contents automatically. I find myself using ls after I cd into a directory all the time. Something like:
The problem is that I want to alias my cd command in such a way that whenever I enter any directory, an ls command is automatically performed. I tried alias cd='cd $1;ls', but it is not working.
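Bash aliases don't take positional parameters, so the $1 in that alias is never filled in; aliases are plain text substitutions. A function wrapping the cd builtin does what the post describes. A minimal sketch:
Code:
# Wrap the cd builtin so every directory change is followed by ls.
cd() {
    builtin cd "$@" && ls
}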
I have searched the forum high and low for the solution with no success, so I will now post this problem, with all known facts. Linux (and Fedora) is brand new to me, so I'm somewhat illiterate with the language and the recommendations I've read in other threads. Please bear with me. I'm reading the book Beginning PHP and MySQL: From Novice to Professional by Cristian Darie. The book has you create an Alias directory for creating the tshirtshop web-based application.
The book uses the directory /home/username/tshirtshop. However, I did not want this in the /home directory, so I created a new directory from the root directory /workspace/tshirtshop. Below are the areas of interest in the file httpd.conf (I restarted the httpd service each time I edited this file):
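For an Alias that points outside the DocumentRoot, Apache generally also needs a <Directory> block granting access to the new path, and on Fedora the apache user must be able to read /workspace/tshirtshop (with SELinux enforcing, the directory also needs an appropriate context). A hedged sketch of what the relevant httpd.conf section could look like for the path in the post; the directive values below are assumptions for Apache 2.2, not the book's exact text:
Code:
Alias /tshirtshop "/workspace/tshirtshop"
<Directory "/workspace/tshirtshop">
    Options Indexes FollowSymLinks
    AllowOverride All
    Order allow,deny
    Allow from all
</Directory>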
I am trying to install Ubuntu onto a new 1 TByte SATA HDD on my Dell 8400. The computer has no other OS installed, as the old HDD containing Windows XP has been removed. The new HDD has been fully formatted and is set up as NTFS with a single partition of 1 TByte. I downloaded the latest version of Ubuntu and burned the ISO file to a CD. When I boot my computer from this CD I am asked to enter my language of choice and then I get the main menu screen. On this screen I choose option (2) - Install Ubuntu. After a few minutes, during which time you can hear the CD being accessed, the screen just displays this main menu and never changes. I have tried twice using this install option, but after 45-50 minutes the screen still never changes. No progress information is given. If this is normal, how long does the installation generally take?
I don't know if this should be a followup to my prior topic [URL] ....
Each of the pieces I've installed has an "Alias" directive in its conf file linking the directory where it lives so it is reachable on my server. For instance, DotClear lives in /usr/share/dotclear/web/ and there is a directive
Code:
Alias /dotclear /usr/share/dotclear/web
that directs http://myserver/dotclear to that site.
Now, I've set up VirtualHost entries for my DotClear and Owncloud with their own hostnames.
The problem is when I go to [URL] ...., I get to my mythweb site.
This is not so good. So, for the sites that have their own hostnames, I removed the "Alias ..." directive. Of course, now I can't get to those sites by going to the primary site, which is probably fine, but I also still get my mythweb site, since that doesn't have its own VirtualHost entry.
This doesn't seem like correct behavior. Is there a better place to put the "Alias ..." directive so that it only works from one site and not all of them?
I am also thinking I should just link the directories into /var/www/html, but I'm not sure that's a better solution.
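An Alias defined in the global configuration is inherited by every virtual host, which is why mythweb keeps showing up under the other hostnames. Scoping the directive inside one <VirtualHost> block restricts it to that site. A hedged sketch, reusing the dotclear Alias quoted above purely as an example (the ServerName is a placeholder; the same applies to the mythweb Alias):
Code:
<VirtualHost *:80>
    ServerName myserver
    # Declared inside this block, the alias is served only by this
    # virtual host, not by the DotClear or Owncloud hosts.
    Alias /dotclear /usr/share/dotclear/web
</VirtualHost>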
This should be a simple thing to accomplish, but I can't seem to figure it out. Essentially, I want to have a bash alias or function that will let me recursively grep the current directory. A while back I added this to my .bashrc:
Code:
alias rg="grep -r --exclude=*/.svn/* --exclude=*.swp"
This works fine, (and also ignores any svn and vim swp files), and I can call it like:
Code:
rg foo *
However, 99.999% of the time, I am only interested in searching in the current directory, so the "*" is a bit redundant. Also, I would say 5-10% of the time, I am typing faster than thinking and forget the "*", so grep just sits there trying to read from stdin. It's a pretty minor thing, but ideally I'd like to be able to just type:
Code:
rg foo
I've tried creating a function to handle this:
Code:
function rg(){ grep -r --exclude=*/.svn/* --exclude=*.swp $1 * ; }
but it behaves exactly the same as the alias above. Escaping the "*" with single quotes doesn't work, and neither does using `pwd` (or even a hard-coded path) in its place.
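One thing that may help: passing "." as the search path instead of "*" makes GNU grep recurse from the current directory itself, so no glob is needed at all, and --exclude-dir handles the .svn directories more directly than a path pattern. A hedged sketch of the function along those lines:
Code:
# Recursive grep rooted at the current directory; no trailing "*" needed.
# --exclude-dir skips .svn directories, --exclude skips vim swap files.
rg() {
    grep -r --exclude-dir=.svn --exclude='*.swp' "$@" .
}
With this, rg foo searches the current tree, and extra grep options such as -i still pass through.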
I have a LAMP environment on Ubuntu 9.04 desktop. I created a directory called projects in /home/user that I use for all my development work.
I am able to access this directory via localhost/projects as the URI in my web browser; I see a listing of all the separate directories and I am able to click on each directory and see its contents.
I have default, default-ssl and projects in my /etc/apache2/sites-available directory. I haven't changed these files since I first created them. I haven't changed the permissions on any of the directories within /home/user/projects since I first set this up.
What can I do to access this directory from the address bar of a browser again?
I want to run a cronjob every 15 minutes that checks a directory for files. If the directory contains more than ten files I want it to send an email to me.
All I have is this...
*/15 * * * * ls -l | wc -l | [filename] | mail -s "This is just a test" [email address]
I would rather not write a bash script. Is there an easier way to do this? I was looking into some commands like find and grep.
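A single crontab line can do it with a test on the count, which avoids writing a separate script. A hedged sketch; the directory and the address are placeholders, and plain ls is piped to wc -l so the 'total' line from ls -l doesn't inflate the count:
Code:
*/15 * * * * [ $(ls /path/to/dir | wc -l) -gt 10 ] && echo "More than ten files in /path/to/dir" | mail -s "This is just a test" user@example.com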
I'm quite new to Linux, but I have configured a simple FTP server and it's working great. I have an FTP-Shared folder with upload and download subfolders. Under uploads and downloads I have identical category subfolders in both, like mp3s, movies, software, etc. As the guys upload, I would like to create a crontab line that moves all the content under /FTP-Shared/upload/mp3/* older than 14 days to FTP-Shared/downloads/mp3/ recursively (like with the cp command), but the timestamp must be checked on the first-level directory and not on the files below it, for example: /mp3/Club Dance/CD1/Hallo world.mp3. This is how far I got:
[root@clients ~]# /usr/bin/find /FTP_Shared/upload/Mp3s/ -depth -mindepth 1 -mtime +14 -type d -exec mv -f {} /FTP_Shared/download/Mp3s/ \;
This command moves the directories and files, but it does not do it recursively.
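Since the timestamp should be taken from the first-level directories only, constraining find with -maxdepth 1 keeps it from descending into the album subfolders, and mv then carries each whole tree across in one go. A hedged sketch using the paths from the post:
Code:
# Look only at the top-level directories under Mp3s and move any that
# haven't changed in 14 days, subfolders and all.
/usr/bin/find /FTP_Shared/upload/Mp3s/ -mindepth 1 -maxdepth 1 -type d -mtime +14 -exec mv -f {} /FTP_Shared/download/Mp3s/ \;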
If /dir2/sub1link is a symlink to /dir1/sub1 and my current working directory is sub1link, is there a quick way to either: change directory to the link source's parent (i.e. something similar to cd .. but taking me to /dir1/), or change directory to the link source (i.e. switch from /dir2/sub1link/ straight to /dir1/sub1)?
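Resolving the symlink to its physical path covers both cases: cd -P switches the shell to the real directory, and pwd -P gives the resolved path to build on. A minimal sketch, assuming the /dir1/sub1 and /dir2/sub1link layout described above:
Code:
# From /dir2/sub1link, jump to the real /dir1/sub1.
cd -P .
# Or jump straight to the link target's parent, /dir1.
cd "$(dirname "$(pwd -P)")"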
If I have a directory /foo with a few files in it, how do I symlink each entry in /foo into /bar/? For instance, if /foo has the files a, b and c, I want to create three symlinks:
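ln accepts multiple targets when the last argument is a directory, so a shell glob can create all the links in one command. A minimal sketch using the paths from the post:
Code:
# Creates /bar/a -> /foo/a, /bar/b -> /foo/b and /bar/c -> /foo/c.
ln -s /foo/* /bar/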
When I set up the server I added multiple IPs. Now that I need to edit the IP info, I can't find any alias files in the network-scripts folder. But after restarting the server the IPs work fine. Is there somewhere else this would be stored?
How would I go about copying files to a directory, yet skip the files that already exist in the directory, and also remove the files that are in the directory. For example:
Code:
$ ls /dir1
img001.jpg img002.jpg
[code]....
Now I would like to copy from dir1 to dir2, but the contents of dir2 would be:
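For the copy-but-skip-existing part, cp has a no-clobber flag and rsync has --ignore-existing; either one leaves files already present in the destination untouched. A hedged sketch with the directories from the post:
Code:
# Copy from dir1 to dir2 without overwriting anything already in dir2.
cp -n /dir1/* /dir2/
# Same idea with rsync.
rsync -a --ignore-existing /dir1/ /dir2/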
My desktop is not displaying files that I put on the desktop days ago. I checked all four workspaces. All the files are still accessible from Nautilus > Desktop. When I drag a file icon (from Nautilus to the empty desktop) and release, the icon does not drop on the desktop; it quickly moves back to the folder where it came from. The top and bottom panels of the desktop still operate normally. I am running Ubuntu 11.04 64-bit with the classic Ubuntu desktop. I also noticed that Startup Applications is missing "chrome", which I added this morning. I am the only user on this system. The system is a dual-boot Ubuntu - Win7. The Ubuntu partition is formatted ext4 and there is a separate NTFS-formatted partition named "storage" for folders that are accessible from both Ubuntu and Win7. That should not be an issue because the desktop folder is on the Ubuntu partition, /home/[user]/Desktop. I have not booted Win7 in the last few days. The desktop first appeared empty after I did two things:
1) installed CryptKeeper from Ubuntu Software Center
2) enabled Automatic Login
After I clicked Applications > System > CryptKeeper, the desktop did not respond to mouse clicks. I did not find any CryptKeeper instructions. I don't know how to use CryptKeeper. The login screen still asked for credentials after I rebooted. So I uninstalled CryptKeeper, disabled Automatic Login, and rebooted. But the desktop is still not displaying the files. Could CryptKeeper have changed something in my profile? Does Ubuntu have a log file to see exactly what I did? What else can I try to restore my desktop and enable Automatic Login?
I want to change the application that opens my mp3 files.
At present, if I double-click an mp3 it opens in the movie viewer for some reason. I know I can use "Open With..." to get a file to open in Rhythmbox instead, but I have over 800 mp3 files and want to change the default for all of them.
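Changing the default handler for the mp3 MIME type covers all of the files at once instead of one by one; from the command line, xdg-mime can set it (the rhythmbox.desktop name below assumes the stock Rhythmbox package):
Code:
# Make Rhythmbox the default application for the mp3 MIME type.
xdg-mime default rhythmbox.desktop audio/mpeg
Nautilus offers the same thing graphically via a file's Properties > Open With tab.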
I have inherited a mixed bag of sorts: several XP users updating an Access mdb, with the BE on a LAMP stack shared via Samba. I have a backup device which gets mounted at /media/disk... Each client record has a folder named after the companyname on the Samba share, and all related documents are placed there. When the backup script runs, it just copies newer or missing files.
Someone has been renaming folders and not matching the folder name to the related companyname from the mdb. So the backup script captures and duplicates the data in the renamed folders. Some client records also have periods in the name (not required from a data POV), such as 'Company Ltd.' instead of 'Company Ltd'. I can produce a list of company names as the folders should be found easily enough, but I get a little stuck with the Linux scripting.
I can easily remove and further prevent any unwanted punctuation in the company name on the client record, and create the correct folder name on the Samba share with VBA, but would also like to:
- for each 'client activity' folder on the backup device
- rename the folder by removing punctuation marks, or
- delete the folder if it is a dupe
I tried ls -al | grep '&' - it properly returns only those lines with an ampersand in the folder name, but it returns all folders when I try that with a '.'.
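That happens because '.' is a regex metacharacter that matches any character; escaping it, or telling grep to treat the pattern as a fixed string, limits the match to literal periods. A quick sketch:
Code:
# Escape the dot so it matches a literal period...
ls -al | grep '\.'
# ...or use -F to treat the pattern as a fixed string.
ls -al | grep -F '.'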
What would be the easiest method to do the renaming? I thought if there was a way to change ownership of the mounted device, then the VBA code (easy to write) would be simple.
OK - I just ran chown -R on the external device, changing ownership to me instead of root. I didn't want to because it took too long, but I can now use the MoveFolder method of the FileSystemObject from my app to do the renaming instead of some sort of bash script (which I was dreading).
When I try to load a php file my browser downloads the file instead of displaying the page. Here is my apache conf file.
Code:
#
# Based upon the NCSA server configuration files originally by Rob McCool.
# This is the main Apache server configuration file. It contains the configuration directives that give the server its instructions.
# See [URL] for detailed information about the directives.
# Do NOT simply read the instructions in here without understanding what they do. They're here only as hints or reminders. If you are unsure consult the online docs. You have been warned.
# The configuration directives are grouped into three basic sections:
# 1. Directives that control the operation of the Apache server process as a whole (the 'global environment').
# 2. Directives that define the parameters of the 'main' or 'default' server, which responds to requests that aren't handled by a virtual host.
# These directives also provide default values for the settings of all virtual hosts.
# 3. Settings for virtual hosts, which allow Web requests to be sent to different IP addresses or hostnames and have them handled by the same Apache server process.
# Configuration and logfile names: If the filenames you specify for many of the server's control files begin with "/" (or "drive:/" for Win32), the server will use that explicit path. If the filenames do *not* begin with "/", the value of ServerRoot is prepended -- so "/var/log/apache2/foo.log" with ServerRoot set to "" will be interpreted by the
# server as "//var/log/apache2/foo.log".
#
There are millions of files in many directories. Whenever I try rm *, or use find or xargs, they say 'argument list too long' and exit. How can I delete the files in a directory with so many files without deleting the directory itself?
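The 'argument list too long' error comes from the shell expanding * into one enormous command line; letting find generate and delete (or feed) the names itself avoids building that list. A hedged sketch with a placeholder path:
Code:
# Delete only the files directly inside the directory, keeping the directory.
find /path/to/dir -maxdepth 1 -type f -delete
# Equivalent via xargs; -print0/-0 also copes with awkward filenames.
find /path/to/dir -maxdepth 1 -type f -print0 | xargs -0 rm -f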
I was doing a tutorial on scripting in bash. I saved my file on the desktop and I cannot seem to get to that file to execute it. Here is what I have been using:
When I try cd Desktop, it says that there is no such directory.
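cd Desktop is a relative path, so it only works from the home directory; using the absolute path gets there from anywhere. A minimal sketch (the script name is hypothetical):
Code:
# Go to the Desktop folder regardless of the current directory.
cd ~/Desktop
# Make the script executable and run it (name is a placeholder).
chmod +x myscript.sh
./myscript.sh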
I have an Ubuntu 9.10 server set up at my house. I have Apache2 and PHP5 installed on it. Every time I go to the server on a web page and try to load the PHP index page it downloads instead of displaying.
I have virtual servers set up and have the files stored at /home/cusinndzl. If anyone needs to take a look I can let them into the webmin panel.
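PHP source being offered as a download usually means Apache has no PHP handler loaded, so it serves the file as plain content. On Ubuntu 9.10 with Apache2, installing and enabling the PHP module is the usual first check; a hedged sketch (package and module names are the ones that release shipped):
Code:
# Install the Apache PHP module, enable it and restart Apache.
sudo apt-get install libapache2-mod-php5
sudo a2enmod php5
sudo /etc/init.d/apache2 restart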
After I solved the many other problems that plagued me as an early adopter of the new iPod touch, I took my freshly recovered device (iOS 4.1, on an iPod touch 4G) and tried syncing with Rhythmbox and Banshee. It shows up in the GUI as a music player and a camera; I press sync in Rhythmbox or Banshee and it seems to go through. But when the sync is over and I look at my device I find no songs on there, even though they're inside the itunes_control folder on my iPod when I view its contents from Ubuntu.
I have succeeded in syncing with iOS 4.1 before, but an error forced me to restore my iPod. That was when I had a jailbroken iPod with the DBversion changed to 2, as per many guides on the internet. But I thought all of these problems were solved with the new libimobiledevice 1.04.