Ubuntu Servers :: Multiple Connections To SMB Server
May 25, 2010
I'm curious how I can use a Windows client with two separate accounts to connect, at the same time, to an SMB server hosting two shares (provided permissions and accounts are all in order).
Scenario: User1 is always logged onto a Windows client mapped to a Public share on a Linux SMB server. I need a way to keep User1 connected to the Public share and then, when needed, allow User1 to provide User2's credentials to connect to a Restricted share. The only way I've been able to do this is to disconnect from the Public share, then reconnect to the Restricted share using User2's credentials. (This is the issue, because I need to keep User1 connected to the Public share.) Is this a limitation of SMB? Or am I missing a configuration? Please point me in the right direction.
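This seems to be less a Samba limitation than a Windows client one: the Windows redirector refuses a second connection to the same server name with a different set of credentials (error 1219). A common workaround is to reach the server by a second name -- its IP address or a DNS alias -- so the client treats it as a separate server. A rough sketch from a Windows command prompt, with the server name, address and workgroup as placeholders:

Code:
net use P: \\smbserver\Public /user:WORKGROUP\User1 *
net use R: \\192.168.1.50\Restricted /user:WORKGROUP\User2 *

The * prompts for each password; User1's mapping to Public stays connected while the Restricted share is attached under User2's account.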
I'm looking at setting up a couple of automated systems. Here are a few examples:
* An internal accounting system to download and process emails
* A public web server for people to visit
I could put each system on its own separate box -- for example, it's generally good practice to separate anything that external users have access to (such as a web server) from internal processes such as accounting. Now, rather than dishing out the money for two separate servers, could I get away with just running a separate VMware instance on the same box for each system?
To give you an idea, these are not large-scale, computationally intensive systems. The accounting one is simply downloading and tallying emails, and the latter is just a web server with maybe 5 hits per day on a good day. I could definitely pick up a new box for say $50, but I wanted to know the general practice of using VMware on the same box versus two separate boxes.
(I have tried a bunch of other IPs too and none outside its network are pingable.) I'm not sure if this is a problem with my server or a problem with the networking outside the server. I have been emailing my server provider and they keep insisting the problem is with the server and that their network is working fine. Apparently all of their other servers work, and they can log in to the gateway and ping 8.8.8.8 from there. So they just want to reinstall the OS, but I thought I'd post here to see if anyone has any ideas.
Here is some info I have gained while troubleshooting: I haven't changed any settings at all on the server for months. I haven't done any updates for about a week. The strangest thing is that this is intermittent; there have been a few times in the last 24 hours when I have been able to ping 8.8.8.8 or other IPs, but 98% of the time I can't. I have also tried rebooting the server, which had no effect. I can ping the gateway, and I can ping other servers on the same subnet. I can ssh onto the server from my home internet connection, and I can view webpages on Apache, so incoming connections work.
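For what it's worth, when outbound pings die intermittently while the gateway and the local subnet stay reachable, it helps to watch what actually leaves the box and what the gateway's ARP entry looks like while the problem is happening. A few generic checks (the interface name eth0 is an assumption):

Code:
ip neigh show                  # is the gateway's ARP entry REACHABLE or stuck in FAILED?
sudo tcpdump -ni eth0 icmp &   # do the echo requests leave, and do replies ever come back?
ping -c 4 8.8.8.8
mtr -rn -c 10 8.8.8.8          # report mode: where along the path do packets stop?

If the requests leave the box but nothing ever comes back past the gateway, that points at the provider's routing rather than at the server.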
I'm curious if anybody can shed some light for me in this department. We're in a large environment with a Windows DHCP Server. We have been tinkering with LTSP on Edubuntu as thin and fat clients. It works great, but right now we just have 1 server handling the lab, which works fine unless we want to expand, which may be very possible.
These are the instructions I received:
Log in to your Windows server and load the DHCP configuration screen.
Create a DHCP reservation for the MAC address you obtained.
Add the configuration options below to enable the machine to boot from the LTSP server:
017 Root Path: /opt/ltsp/i386
066 Boot Server Host Name: <ip address>
067 Bootfile Name: ltsp/arch/pxelinux.0 # Specify CPU architecture in place of 'arch', for instance 'i386'
From: [url]
I'm curious, what if I want to have multiple Ubuntu servers on the network that I want to have bootable? For example, let's say I have 3 labs and 3 servers: Server A for Lab A, Server B for Lab B, and Server C for Lab C. I want all of Lab C's computers to boot from Server C, Lab B's from Server B, and Lab A's from Server A.
1 - How would I add multiple entries on the Windows DHCP Server to allow clients to boot from all 3 servers (A, B, and C)?
2 - How would I be able to isolate the clients so ONLY Lab A clients boot from Server A, etc.?
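If each lab sits in its own subnet/VLAN with its own DHCP scope on the Windows server, both questions can be answered at once: set options 017/066/067 at the scope level rather than server-wide, so Lab A's scope hands out Server A, Lab B's hands out Server B, and so on. That's the same screen as in the instructions above, just under each scope's Scope Options instead of Server Options. A rough netsh equivalent, with the scope and server addresses entirely hypothetical:

Code:
netsh dhcp server scope 10.0.10.0 set optionvalue 017 STRING "/opt/ltsp/i386"
netsh dhcp server scope 10.0.10.0 set optionvalue 066 STRING "10.0.10.5"
netsh dhcp server scope 10.0.10.0 set optionvalue 067 STRING "ltsp/i386/pxelinux.0"
rem repeat for Lab B's scope pointing at Server B, and Lab C's at Server C

If all three labs share one subnet, per-client reservations (as in the instructions above) with reservation-level options give the same isolation, just with more clicking.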
I'm not sure if this belongs in the Server or Networking section of the forums. Anyway, last month I upgraded my server to Ubuntu 10.04 LTS. Since then, I've had a recurring problem wherein after a certain period of time, the server stops accepting network connections. Ubuntu 10 will continue to reject network connections until someone logs into the server locally, after which time network connectivity is restored and the cycle begins anew. Essentially, the server goes into a "half sleep mode". I say half because the computer is still on and the fans are running.
I've done some searching around various forums and initially figured this issue was related to problems with the Network Manager service (https://bugs.launchpad.net/ubuntu/lu...er/+bug/524454), so I removed the service altogether. However, this problem is still occurring.
I've pored over /var/log/messages and /var/log/syslog, but have noticed no irregular behavior. Has anyone else experienced this issue? I'd rather not resort to downgrading back to Gutsy Gibbon if I can help it.
I am happy to provide more information if it's needed.
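One thing worth ruling out on a headless 10.04 box: if the NIC is still managed per login session by a desktop-style network stack, networking can effectively stop until someone logs in. Defining the interface statically in /etc/network/interfaces and letting ifupdown own it takes the login out of the loop; a minimal sketch with hypothetical addresses:

Code:
# /etc/network/interfaces
auto lo
iface lo inet loopback

auto eth0
iface eth0 inet static
    address 192.168.1.20
    netmask 255.255.255.0
    gateway 192.168.1.1

It's also worth checking the NIC's power management; 'ethtool eth0' will at least show whether the link itself is going down when the server falls asleep.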
I have two Ubuntu boxes. One is a 9.04 desktop edition and the other is a 9.10 server edition. I am working on some code that needs to be highly tolerant of bad network connections. It sends transactions to a central database, but when the network is not available, it caches them locally to retry later.
I have the code working beautifully on my desktop box, but when I test it on this other box (the one running server edition) there is a HUGE DELAY every time it tries and fails to send a transaction to the database when the network is down.
I tested a little further, and I found that if I unplug my network cable and run ping somehost on the desktop, it fails instantly saying "ping: unknown host somehost". But if I unplug the cable on the server box and run the same ping command, it lingers for about 40 seconds before the ping fails.
Does anybody have any idea why this might be happening? Is this a 9.04 vs 9.10 difference? Is this a desktop vs server difference? Is there some package I can install, or some config setting I can change that will make the server box insta-fail just like the desktop does?
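The ~40-second hang is most likely DNS rather than ping itself: with the cable unplugged, the resolver on the server box keeps retrying its configured nameservers until the default timeouts expire, while the desktop box happens to fail fast. You can shorten the resolver's patience; a sketch, assuming nothing else on the box rewrites /etc/resolv.conf (the nameserver address is an example):

Code:
# /etc/resolv.conf
nameserver 192.168.1.1
options timeout:1 attempts:1

Independently of that, the transaction code can avoid the stall by caching the database host's resolved address, or by putting a short connect timeout on the database client.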
I need to create an Apache web server with multiple user accounts, for work:
* Each user needs to be able to upload his/her files via SSH.
* Each user needs his/her own web directory (preferably in their home directory, for ease with permissions).
* Their web directories need to be password protected.
* Only one user account (mine) should be allowed remote SSH control.
* It needs to be easy to add new users to the system.
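On a stock Apache install, one way to sketch this is mod_userdir for the per-user web directories plus HTTP basic auth for the passwords, so adding a user is just adduser plus an htpasswd entry; restricting interactive shell access to your own account while still allowing SFTP uploads is then a separate sshd_config matter (for example a Match block forcing internal-sftp for the other users). A rough outline with example usernames and paths:

Code:
sudo a2enmod userdir                    # serves /home/<user>/public_html at http://server/~<user>/
sudo adduser alice
sudo -u alice mkdir /home/alice/public_html
sudo htpasswd -c /etc/apache2/htpasswd-alice alice

# then protect the directory, e.g. in a <Directory /home/alice/public_html> block:
#   AuthType Basic
#   AuthName "Restricted"
#   AuthUserFile /etc/apache2/htpasswd-alice
#   Require valid-user

sudo service apache2 reload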
Recently I've been earning money doing web development, php, html/css, MySQL and so on. What I have encountered a lot are clients that need a complete solution. They need their site built, but they also need a hosting solution. I've sent more than just a few clients off to GoDaddy, and quite frankly, I'd like to cash in on some of that.
It would do wonders for my business if I could offer them a hosting solution with full support on top of building their site. My problem is I have no idea how to do this. So I'd like to know how I can host multiple sites on the same server. Does anyone know of a nice guide I can follow to set this up? It's really important that I can add sites fairly easily over the internet, since I will be away at school and won't have direct access to the server.
I work for a college with many departments. I'd like to deploy just one LDAP/krb5 server (plus slave replicas) to authenticate all users in all departments. Is it possible to do this? The proposed DNs for the departments match what is done for NIS now. I'd appreciate any pointers or URLs that describe how to properly do this.
I am currently working on managing multiple Linux servers in remote locations, used mainly for web hosting. I need to back up their data to a backup server, but rsync, which I'm currently using, doesn't help. Is there any tool to back up every server without modifying it? There are hundreds of servers, so installing a tool on every server is a time-consuming process.
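If the web servers already have sshd and rsync on them (almost every hosting box does), there's nothing extra to install: run rsync in pull mode from the backup server over SSH. A minimal sketch, assuming key-based root SSH and a plain text list of hostnames; the paths and list file are hypothetical:

Code:
#!/bin/bash
# pull each server's web data and configs onto the backup box
while read -r host; do
    rsync -az -e ssh "root@${host}:/var/www/" "/backups/${host}/www/"
    rsync -az -e ssh "root@${host}:/etc/"     "/backups/${host}/etc/"
done < /root/servers.txt

Tools like rsnapshot or BackupPC wrap the same pull-over-SSH idea and add rotation, again without touching the remote servers.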
By now I have 10 servers for HPC, oriented toward power computing. My users need to launch several processes using qmake. The users are used to working with Ubuntu 9.10, and the software from the repositories is suitable for them. I've deployed Ubuntu 9.10 to all 10 servers (PXE rocks). For now we work with parallel-ssh and cluster-ssh, which allow us to launch the same process on all servers. With these tools the servers remain independent, but with the same software and the same launched command. Now we would like to go to the next step and see all the servers as a single one, with all the resources from the other 9 as if they were its own. The difference would be substantial in time to process and also in time to design the command to launch.
Running Ubuntu Server 9.10. I have a couple of programs that I would like to start at boot; they will both run forever. I have created a cron job with @reboot and that will start a program, but if I have multiple programs in there it waits for the first to finish before starting the next.
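cron starts each crontab entry in its own process, so two separate @reboot lines won't wait on each other; the serialisation only happens when both programs are chained on one line. Giving each program its own @reboot entry, and backgrounding it in case it never detaches, avoids the problem. A sketch with hypothetical program paths:

Code:
# crontab -e
@reboot /usr/local/bin/first-daemon  >> /var/log/first-daemon.log  2>&1 &
@reboot /usr/local/bin/second-daemon >> /var/log/second-daemon.log 2>&1 &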
I'm moving all my websites to a dedicated box. I had a cPanel hosting account, and am now moving to a terminal and SSH setup. I need some advice on choosing my mail setup. I need POP and SMTP, with multiple domains, as a host for a few clients. I would like the simplest version. The server will only send +- 100 mails/day. I am currently running Ubuntu Linux 10.04.
I have recently set up an Ubuntu 10.04 Minimal x64 Server. I plan on setting it up as a mail server. I need a secure server that has spam prevention on it. I'm setting up around 50 domains and would like a web-based control panel. What is the most secure mail server that I can set up for multiple domains?
I'm trying to configure lighttpd to send SCGI requests to different ports, depending on what file(s) are accessed. Is this possible? This is what I've tried, and it hasn't worked.
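It should be possible: scgi.server takes the same shape as fastcgi.server, and separate definitions can be wrapped in URL conditionals so different paths hit different backend ports. A sketch, with the URL prefixes and ports as placeholders:

Code:
server.modules += ( "mod_scgi" )

$HTTP["url"] =~ "^/app1/" {
    scgi.server = ( "/app1" =>
        (( "host" => "127.0.0.1", "port" => 4000, "check-local" => "disable" ))
    )
}
$HTTP["url"] =~ "^/app2/" {
    scgi.server = ( "/app2" =>
        (( "host" => "127.0.0.1", "port" => 4001, "check-local" => "disable" ))
    )
}

If it still doesn't route as expected, enabling debug.log-request-handling = "enable" shows in the error log which handler each request gets matched to.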
I have multiple video streaming servers (Red5) running on machines internally on the LAN, for different subdomains, on Ubuntu 10.04. The front end to them is Apache2 on a bastion host. To be able to reach the streaming server I embed JavaScript in HTML pages as follows:
Code:
<embed ..... var="rtmp://site1.my_domain.com" >
[code]....
How will I make sure this rtmp request is mapped to a port different than 1935, as there are three other streaming servers which also have to respond to their respective requests?
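One approach, assuming the pages for the different subdomains can carry an explicit port in their rtmp:// URLs, is to give each internal Red5 box its own external port on the bastion host and DNAT it across. A sketch with hypothetical internal addresses (IP forwarding and matching FORWARD rules are assumed to be in place on the bastion):

Code:
# site1 keeps the default 1935; site2 and site3 are reached via 1936/1937 on the bastion
iptables -t nat -A PREROUTING -p tcp --dport 1936 -j DNAT --to-destination 192.168.10.12:1935
iptables -t nat -A PREROUTING -p tcp --dport 1937 -j DNAT --to-destination 192.168.10.13:1935

The embed for site2 would then use rtmp://site2.my_domain.com:1936 instead of the default port.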
I really hope someone could help me with this problem. I've been stuck on this for a month. I am using the sftp command to upload files using a bash script. The problem is that it is extremely slow to do it this way, as many of you would know if you have a shared server somewhere. I would use scp if the remote server supported it, but it doesn't.
Anyway, if any of you have ever used FileZilla, in the Settings, if you go to "Transfers" there is a place where you can set the number of "maximum simultaneous transfers". This feature works wonders with SFTP (and FTP too). It really speeds things up. How do I accomplish this same thing with the sftp command? Because I don't want to use a GUI. I don't even mind using FileZilla through the command line if possible, but it does not seem to be possible. I've been stuck on this for a month! I've searched everywhere and tried a lot of things to no avail.
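Two command-line routes worth trying: background several sftp batch jobs from the script, or use lftp, which speaks SFTP and has the same "maximum simultaneous transfers" idea built in. A sketch assuming lftp is installed; the user, host and paths are placeholders:

Code:
# upload a whole directory over SFTP with 4 parallel transfers
lftp -u myuser -e "mirror -R --parallel=4 /local/site /remote/site; quit" sftp://remote.example.com

mirror -R uploads (reverse mirror); --parallel controls how many files are transferred at once, which is where the FileZilla-style speedup comes from.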
I am a total newbie in Ubuntu 10.04. I have just installed it in my office, where I have two network cards: one connected to a router giving internet access, and the other connected to the Windows-based work network providing access to the work network resources. In XP everything works fine, as I can keep both connections alive and get the results I want.
However, I don't know how to do this in Ubuntu 10.04. So far I have set up the first connection directly to my router and I have internet access, but I cannot set up the other one. Another question is how I can force Ubuntu to use the router connection as the default one when I log in.
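Only one of the two cards should carry a default gateway: point it at the internet router, and give the work-network card just an address plus, if needed, a static route to the work subnets. As a quick test from a terminal (all addresses are examples):

Code:
# make the internet router the default gateway on eth0
sudo ip route del default
sudo ip route add default via 192.168.1.1 dev eth0
# reach the work network via the second card without stealing the default route
sudo ip route add 10.10.0.0/16 via 10.10.0.1 dev eth1

To make it stick, set the gateway only on the internet-facing connection (in Network Manager or /etc/network/interfaces) and add the work subnets as static routes on the other one.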
A daemon, say ssh, will be listening on port 22. When a new connection is requested by the client, it will be authenticated and a new connection gets established with some port, say 1025, and ssh will continue to listen on 22 for new connections. If I am correct, then on my machine I observed the following: connections are established to ssh port 22. As per my understanding the connection should be established on a different port other than 22.
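For reference, a TCP connection is identified by the full source/destination address-and-port 4-tuple, so the server side of every ssh session really does stay on port 22; it's the client that picks a new ephemeral port (like 1025) for each connection, and sshd keeps the listening socket on 22 open at the same time. Both can be seen with something like:

Code:
# established ssh sessions -- note the local side stays on :22
sudo netstat -tnp | grep ':22'
# the same view with ss; add -l to see the listening socket as well
sudo ss -tnp 'sport = :22'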
I currently am running 10.04.1 and have successfully setup my home web server to run a single website. My current settings are:
-I have registered the domain name annarrankings.com through GoDaddy
-A record is: host = @ and points to = 71.114.220.3
-CNAME is: host = www and points to = @
-On my server I have the site running in /var/www
I've done some research and found that to run multiple websites I need to setup VirtualHost.
-So I created a folder /var/www/annarrankings.com and moved my site to that folder
-Edited Apache2.conf to add the following line
-I then went to /etc/apache2/sites-enabled and copied the default file to a new file called annarrankings.com. Here's the annarrankings.com file after I edited it
-I then created a link in /etc/apache2/sites-enabled to the annarrankings.com file in /etc/apache2/sites-available
-Next I edited /etc/hosts
-When I went to enable the site using a2ensite annarrankings.com I got the following
I figured this was OK since I had already created a symbolic link earlier (a result of trying to follow multiple tutorials and ..... videos at once), so I reloaded apache2. I created an index.html file in /var/www/ just for testing purposes, and when I load www.annarrankings.com I get the file located in /var/www/ instead of the website located in /var/www/annarrankings.com. Do I need to change my A record or CNAME in GoDaddy, or did I just do this completely wrong?
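For comparison, a minimal name-based virtual host on the Apache 2.2 that ships with 10.04 usually needs only the block below in sites-available (the stock NameVirtualHost *:80 in ports.conf is already there); getting the /var/www index instead of the site is typically the default vhost answering because the ServerName/DocumentRoot in the new file don't line up. A sketch based on the paths described above:

Code:
# /etc/apache2/sites-available/annarrankings.com
<VirtualHost *:80>
    ServerName  annarrankings.com
    ServerAlias www.annarrankings.com
    DocumentRoot /var/www/annarrankings.com
    <Directory /var/www/annarrankings.com>
        Order allow,deny
        Allow from all
    </Directory>
</VirtualHost>

Then 'sudo a2ensite annarrankings.com' and 'sudo service apache2 reload'. The A record and CNAME at GoDaddy shouldn't need to change for this, and /etc/hosts doesn't need an entry either.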
We have to install CentOS 5.5 on approx 60 servers and we want to have a server on which we can create an image of 1 server and deploy it to all the other servers through PXE. Most of the servers will have RAID 5 or RAID 1 configured, so the utility should have RAID support.
I have 16 Linux servers that use /etc/hosts files to see and talk with each other. I'm adding servers to this pool of servers. It is required to do host resolution via the /etc/hosts files; DNS or NIS are not alternatives. Aside from manually editing each of the 16+ /etc/hosts files every time I add a server, or editing one /etc/hosts file on one server and then scp'ing it to all the other servers, is there any way to edit the /etc/hosts on one server and "push" it onto the other servers that need the new /etc/hosts file?
Everywhere I've looked on the Net, there hasn't been any suggestion except for the options I mention here.
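Short of moving to DNS, the usual answer is exactly the scp push, just automated: keep one master copy and loop it out over SSH keys (or use a parallel wrapper like pssh/pdsh if it's available). A sketch, with the server list file as a placeholder:

Code:
#!/bin/bash
# push the master hosts file to every listed server (assumes root SSH keys are in place)
while read -r host; do
    scp /etc/hosts "root@${host}:/etc/hosts"
done < /root/all-servers.txt

Configuration management tools (puppet, cfengine) do the same job with less scripting once the server count keeps growing.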
I'm running Ubuntu Lucid and I was thinking of purchasing one or more extra wifi cards to try to configure my computer to manage different connections at the same time, with different ISPs. The thing is that I'm not quite sure if what I want to do is actually possible.
The easiest way that crossed my mind was to configure one or multiple virtual machines that are redirected through proxies to Ubuntu, and to configure each proxy port to go through a different internet gateway. This way I might be able to divide the things I want to download across different sessions of JDownloader, installed on each virtual machine. The negative aspect of this idea is that having multiple JDownloader sessions will make my laptop work at almost 100% for sure...
Another thought I had was to make JDownloader manage its downloads in only one session, redirecting them across my internet connections; the negative thing is that I think I would have to modify its source and learn Java...
And the last possible configuration I had in mind was to make Ubuntu directly add up all my internet connections and manage them as if they were one. The negative thing here is that I might not be able to get multiple downloads from some sites.
Well, all these were just thoughts. I'm struggling over whether to buy another card or not and try to set up any of these configurations, but I'm not really sure if any of them are actually possible. Is there an easy way to manage this?
I just want to get the most out of my internet connections. If I'm at college I have two options that are quite slow; adding them up with two cards would be great, and I might also be able to add a third and a fourth connection. Also, if I'm at a coffee shop and I need some bandwidth, I could try to make it go over an open network around, etc.
I have a question concerning mobile broadband connections: is it possible to have multiple mobile broadband connections active at the same time? I currently have two cellphones connected to my Ubuntu box and both of them can successfully set up an Internet connection for my Ubuntu box (say connection X or connection Y), though not at the same time. Unfortunately both connections are very limited in speed (around 10kb/s each, and this is my only access to the Internet where I live). Therefore, I wish to use the two connections simultaneously to download files from the Internet (file 1 with connection X and file 2 with connection Y). Is this method of doubling my bandwidth possible in Ubuntu?
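It is possible in the sense that different downloads can be steered out of different connections (a single download won't get faster). That takes a source-based routing rule per link plus binding each download to one link's local address. A rough sketch with entirely hypothetical addresses, assuming the two phones come up as ppp0 and ppp1 (the real addresses show up in 'ip addr' once each connection is active):

Code:
# one routing table per mobile connection
sudo ip route add default dev ppp0 table 101
sudo ip route add default dev ppp1 table 102
sudo ip rule add from 10.64.1.2 table 101    # ppp0's local address
sudo ip rule add from 10.64.2.2 table 102    # ppp1's local address

# pin each download to one link
wget --bind-address=10.64.1.2 http://example.com/file1.iso &
wget --bind-address=10.64.2.2 http://example.com/file2.iso &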
This is the current setup that we have: We have approx 20 clients who pay us to send out a type of e-mail called an E-Blast to their customers. We currently are using 5 Microsoft Windows Virtual Servers to do this. The problem is that those machines are starting to break down. There are times that it will take Microsoft Windows approx 9-10 hours to complete 1 job. This is way too long. We want to move away from Microsoft Windows for this particular type of job as it seems there are more customers who are wanting to use this type of advertising.
It seems that using a Linux server "command line or shell" environment would be the best way to go, as there is no GUI like Windows. Since it is just text, that is something that would/should process very, very quickly.
I am in the process of setting up a new SMTP outbound mail server. This is the current software & configuration (what is installed on this new machine):
All of the customer data (Names, E-Mail Addresses, etc that these e-mails are going to) are currently loaded in a Microsoft SQL Database.
My machine that I am using is plugged into the DMZ. I have 1 ip address for the 1 network card. I have also added/bound 4 more ip addresses to that network card.
I have configured Postfix for Multiple IP Addresses.
I can, from the command line, send successful test e-mails and receive them in my personal account.
As far as I know everything is setup correctly. I can and will post requested information so that it can be verified that everything is setup correctly.
Here are a couple of my questions:
How can I ensure that I have my network/interfaces file and my Postfix master.cf/main.cf files set up correctly?
How can I setup this server to be an Outbound SMTP server and get it to use all 5 of the IP Addresses to send these e-mails quickly?
What can I use to check and ensure that this server is in fact sending out emails on all 5 IP addresses? (I heard that there is a program named "Postal" that may help in determining this.)
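On the "use all 5 IP addresses" question: Postfix sends from whatever address smtp_bind_address tells its smtp client to use, so one common pattern is to define one outbound smtp transport per IP in master.cf and map senders (or destination domains) to those transports in main.cf. A sketch with placeholder addresses and made-up transport names:

Code:
# /etc/postfix/master.cf -- one outbound smtp client per source IP
out1  unix  -  -  n  -  -  smtp
  -o smtp_bind_address=203.0.113.11
out2  unix  -  -  n  -  -  smtp
  -o smtp_bind_address=203.0.113.12

# /etc/postfix/main.cf
sender_dependent_default_transport_maps = hash:/etc/postfix/sender_transport

# /etc/postfix/sender_transport (run postmap on it afterwards)
#   client-a@example.com    out1:
#   client-b@example.com    out2:

/var/log/mail.log then records which transport handled each delivery, which is a simple way to confirm that all five addresses really are being used.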
I am tasked with setting up 3 out of the 6 servers and dividing up 500GB of space in the most efficient manner amongst the 3 servers. The space is in a pool which can be assigned to virtual drives. Each virtual drive can be assigned as disk0 or disk1 and so on to one or more servers. They'll be running CentOS.
On the second try I came up with this scheme:
shared sda1 -- /boot (ext3)
shared sda2 -- /home (ext3)
I currently have 7 servers that report logwatch every day. The fact is that it's a lot of information to process every day. I would like as short an overview as possible of the events that happened in the last 24 hours, with only critical/warning information. It would be a plus if all the servers' output could be gathered in one email.
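logwatch itself can cut the volume per host (a low --detail level and a restricted --service list), and the per-host output can be pulled together into one mail from a central box over SSH. A sketch assuming SSH keys and a reasonably recent logwatch (older versions use --print instead of --output stdout); hostnames and the destination address are placeholders:

Code:
#!/bin/bash
# collect a short logwatch report from each server and send one combined mail
{
  for h in web1 web2 web3 web4 web5 web6 web7; do
      echo "===== ${h} ====="
      ssh "root@${h}" "logwatch --detail Low --range yesterday --output stdout"
  done
} | mail -s "Daily logwatch summary" admin@example.com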
I got 2 ADSL accounts from a provider, so I decided to configure a server as a gateway for my other PCs. I created the ppp0 device using pppoe-setup over eth0. Then I configured the second one as ppp1 over eth2 to the second modem. When I finished, I used ifconfig to check the settings and I got only ppp0 and didn't see ppp1. I tried to ifup ppp1 but I still get one device with ifconfig. The adsl-start command starts only ppp0. What should I do to get the 2 lines to work simultaneously? Are there any configuration files that need to be edited?
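One detail that may explain it: with rp-pppoe, adsl-start/pppoe-start only reads /etc/ppp/pppoe.conf unless it is given a different configuration file, so configuring the second line over the first one just replaces it. A sketch of one way to run both, assuming the rp-pppoe scripts are what is installed (the second file name is made up):

Code:
sudo cp /etc/ppp/pppoe.conf /etc/ppp/pppoe2.conf
# edit pppoe2.conf: set ETH=eth2 and the second account's USER (plus its pap-secrets entry)
sudo pppoe-start /etc/ppp/pppoe.conf     # brings up ppp0 on eth0
sudo pppoe-start /etc/ppp/pppoe2.conf    # brings up ppp1 on eth2

Once both links are up, routing rules still decide which traffic uses which line, as in any dual-WAN setup.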