Server :: Alternative To SSL Certificates / Make Self-signed Certificates Work On Most Popular Browsers Without Being Flagged?
Aug 24, 2010
I run a web server on Fedora 12, principally using Apache, MySQL, and PHP. I host a variety of sites, one of which is a family website that contains semi-sensitive personal data for several hundred extended family members, who all have access to the database-driven site.
Until now, I have been using a self-signed SSL certificate to encrypt the data as it is read from and written to my database. Family members have simply had to put up with clicking past certificate warnings as they enter the site, since most browsers flag self-signed certificates as untrusted. It hasn't really been much of a bother, but I'd love to do it more professionally. I have looked into buying SSL certificates, but it's a site I host for free, and I would rather find a cheap or free alternative if possible.
So I'm just fishing for ideas to work with. What are some alternatives to using SSL certificates for moderately strong website encryption? So far, I run only one host on the domain, but may eventually need encryption that would support multiple hosts. Or does anybody know a way to make self-signed certificates work on most popular browsers without being flagged as suspicious?
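One approach that keeps things free is to become your own CA: create a CA certificate once, sign the server certificate with it, and have family members import the CA certificate as a trusted root in their browsers (a one-time step per person). A rough sketch with openssl, assuming example file names and a 10-year CA lifetime:

Code:
# Create the CA key and self-signed CA certificate (names are examples)
openssl genrsa -out family-ca.key 2048
openssl req -new -x509 -days 3650 -key family-ca.key -out family-ca.crt

# Create the web server key and a signing request for it
openssl genrsa -out server.key 2048
openssl req -new -key server.key -out server.csr

# Sign the server request with the CA
openssl x509 -req -days 365 -in server.csr \
    -CA family-ca.crt -CAkey family-ca.key -CAcreateserial \
    -out server.crt

Point Apache's SSLCertificateFile/SSLCertificateKeyFile at server.crt and server.key, and distribute family-ca.crt for people to import; once the CA is trusted, certificates it signs (including ones for additional hosts later) stop triggering warnings.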
I have a server with Webmin, Usermin, and Sendmail using POP3S. I have created a self-signed certificate using Webmin, exported it, and imported it into the trusted root certification authorities store on my client. This fixes the warning message from Internet Explorer when making an SSL connection to Webmin. But when I attempt to use Usermin or retrieve mail, I still get the warning that this site's certificate is self-signed. I looked at the certificate and it is not the same as the one I created with Webmin. My question is: is it possible to have the same certificate used by each?
I am using Red Hat 5 and I want to create X.509 certificates for an IPsec VPN. I am not able to create the certificates; please help me create them, and tell me where the certificates are supposed to be located.
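Not an IPsec-specific recipe, but as a sketch of generating a basic self-signed X.509 certificate and key with openssl (file names are examples; the destination directories assume an Openswan-style layout under /etc/ipsec.d/, so check your VPN software's documentation):

Code:
# Generate a key and a self-signed certificate in one step
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
    -keyout vpn-gw.key -out vpn-gw.crt

# Typical Openswan locations (verify for your setup)
cp vpn-gw.crt /etc/ipsec.d/certs/
cp vpn-gw.key /etc/ipsec.d/private/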
We have a web server running Apache and a custom web app that we log into from a web browser; it asks you to accept the certificate and all is well. I now have a user on Windows Server 2008 who wants to manually import the *.cer file into his browser to be able to log in. My questions are:
1 - What is the file that is being imported into the browser? *.pem or *.crt?
2 - I see that our certs on the server are located, I believe, in /etc/pki/tls/certs. The openRADIUS servers that I have created also store theirs in this directory. Is this the typical placement for certs?
3 - If the file is a *.crt or *.pem, could I use openssl to convert it to the appropriate *.cer file for IE7?
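On question 3, a PEM certificate can usually be converted to the DER-encoded *.cer that IE expects with openssl; a sketch, assuming the input file is cert.pem:

Code:
# PEM (Base64) to DER (binary .cer)
openssl x509 -in cert.pem -inform PEM -outform DER -out cert.cer

IE will also import a Base64 PEM file, so sometimes simply renaming the *.crt to *.cer is enough.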
I have OpenVPN working with a third-party CA and am validating UID entries from the client certificates against LDAP groups. My next step is to figure out OCSP to make sure revoked certificates are denied. I could dump out my CRL as a nightly job, but that of course leaves a window during which a revoked certificate is still valid. How do I dump a client certificate back to PEM format? For the LDAP check all I was using was the DN, which doesn't really help me for openssl/ocsp.
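For the conversion part, if the client certificate you have is DER-encoded, openssl can rewrite it as PEM and then query an OCSP responder with it; a sketch with assumed file names and responder URL:

Code:
# DER -> PEM, if needed
openssl x509 -inform DER -in client.cer -out client.pem

# Ask the OCSP responder about this certificate
openssl ocsp -issuer ca.crt -CAfile ca.crt \
    -cert client.pem -url http://ocsp.example.com/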
I've installed PostgreSQL on Arch Linux and also generated self-signed certificates in the /etc/ssl/ directory. My PostgreSQL 'data' directory is /var/lib/postgres/data, and I've edited my postgresql.conf file to use SSL; however, I'm having permission/access problems starting my database with SSL. It can't access the certificates and errors out when I try to start the database engine:
Code:
LOG:  autovacuum launcher shutting down
LOG:  shutting down
LOG:  database system is shut down
FATAL:  could not load server certificate file "server.crt": No such file or directory
I don't know what I need to chown or chmod in order to get PostgreSQL to access my self signed certificates.
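For reference, PostgreSQL of that vintage looks for server.crt and server.key inside the data directory itself and refuses to start if the key is readable by anyone but the postgres user. A sketch, assuming the files in /etc/ssl/ are named server.crt and server.key (adjust to your actual names):

Code:
cp /etc/ssl/server.crt /etc/ssl/server.key /var/lib/postgres/data/
chown postgres:postgres /var/lib/postgres/data/server.crt /var/lib/postgres/data/server.key
chmod 644 /var/lib/postgres/data/server.crt
chmod 600 /var/lib/postgres/data/server.key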
I'm trying to set up a second SSL cert on a different domain on a server; each domain has its own IP address. The problem is that the web developer who configured the first domain specified SSL keys for the primary domain in both the vhost config in httpd.conf AND in the ssl.conf config file. If I attempt to remove the keys from ssl.conf, the server will not start up, and with them there it will not start up if I specify keys for the secondary domain.
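One layout that usually avoids the conflict is to strip the certificate directives out of the catch-all vhost in ssl.conf (or remove that default vhost entirely) and give each IP-based vhost its own SSLEngine/SSLCertificate* lines; a sketch with placeholder IPs, names, and paths:

Code:
# httpd.conf (or a conf.d include); Listen 443 is usually already set in ssl.conf
<VirtualHost 192.0.2.10:443>
    ServerName www.domain-one.example
    SSLEngine on
    SSLCertificateFile    /etc/pki/tls/certs/domain-one.crt
    SSLCertificateKeyFile /etc/pki/tls/private/domain-one.key
</VirtualHost>

<VirtualHost 192.0.2.11:443>
    ServerName www.domain-two.example
    SSLEngine on
    SSLCertificateFile    /etc/pki/tls/certs/domain-two.crt
    SSLCertificateKeyFile /etc/pki/tls/private/domain-two.key
</VirtualHost>

If the server won't start when the directives are removed from ssl.conf, the error log usually points at what it still expects; often the whole default <VirtualHost _default_:443> block there needs to be commented out, not just its key lines.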
I have installed Ionix vCM onto a Red Hat Linux box. It correctly communicates with the collection server if I use the Ionix certificate. However, if I use a self-generated certificate, communication fails.
(1) How do I determine which PKI certificates are resident on the Red Hat box?
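On (1), a starting point is just to list the usual certificate directories and dump each file's subject and dates with openssl; the paths below are common Red Hat defaults, so adjust as needed:

Code:
# Common certificate locations on RHEL-style systems
ls -l /etc/pki/tls/certs/ /etc/pki/CA/certs/ 2>/dev/null

# Inspect one certificate
openssl x509 -in /etc/pki/tls/certs/localhost.crt -noout -subject -issuer -dates

# Find stray .crt / .pem files elsewhere
find /etc -name '*.crt' -o -name '*.pem' 2>/dev/null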
I have vsftpd running as an FTP server on Ubuntu 9.04 Jaunty. Login works correctly with a password for local users (those with a login account on the server) and without a password for anonymous.
I want to further tighten security by requiring local users to provide a client certificate. But even if I include "require_cert=YES" and "validate_cert=YES" in /etc/vsftpd.conf, clients without a certificate are still allowed to log in; require_cert seems to be simply ignored.
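For what it's worth, require_cert only applies to connections that are actually using SSL, so if plain (non-SSL) logins are still permitted the option never gets a chance to run. A sketch of the relevant vsftpd.conf directives, with assumed certificate paths:

Code:
# /etc/vsftpd.conf (SSL must be forced for require_cert to matter)
ssl_enable=YES
force_local_logins_ssl=YES
force_local_data_ssl=YES
rsa_cert_file=/etc/ssl/certs/vsftpd.pem
require_cert=YES
validate_cert=YES
ca_certs_file=/etc/ssl/certs/my-client-ca.crt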
I run a couple of sites in a virtual hosting environment and I need to add an additional SSL cert for a different domain name. Some forum topics I have read indicate that an SSL cert requires its own IP address, meaning one cert per IP. Is this true? If so, I'm having some difficulty understanding the benefit of running virtual hosts if a server can't host multiple secured sites through a single IP. Is there any way to run multiple SSL sites within a virtual host environment? I'm hoping for a possible workaround.
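The one-IP-per-certificate rule is the traditional answer; the workaround is SNI (Server Name Indication), which lets several certificates share one IP as long as Apache is 2.2.12 or newer, is built against an SNI-capable OpenSSL, and the visiting browsers support it (IE on Windows XP does not). A sketch of two name-based SSL vhosts on the same address, with placeholder names and paths:

Code:
NameVirtualHost *:443

<VirtualHost *:443>
    ServerName www.first-site.example
    SSLEngine on
    SSLCertificateFile    /etc/ssl/certs/first-site.crt
    SSLCertificateKeyFile /etc/ssl/private/first-site.key
</VirtualHost>

<VirtualHost *:443>
    ServerName www.second-site.example
    SSLEngine on
    SSLCertificateFile    /etc/ssl/certs/second-site.crt
    SSLCertificateKeyFile /etc/ssl/private/second-site.key
</VirtualHost>

The other common fallback is a single wildcard or multi-domain (SAN) certificate covering all the names, which works with any SSL-capable browser but ties all the sites to one certificate.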
I am having problems creating SSL certificates for use with OpenLDAP. Does anyone know a good CentOS tutorial? I am having trouble finding one by searching Google and the forums.
To clarify further, I have a small network I'm trying to set up to use LDAP for auth; given its size, I figured using Kerberos for auth would be a bit of overkill.
I have the server up and running fine; however, at the moment all auth is done in clear text (which is fine, as the network currently has no connection to the internet), but in the future it will, so I am trying to use SSL. However, I am confused about which certificates to point to where in the slapd.conf file.
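For the slapd.conf side, there are really only three TLS directives to point at files: the CA certificate, the server certificate, and the server key; clients then need the same CA in ldap.conf. A sketch with assumed paths:

Code:
# /etc/openldap/slapd.conf (server side)
TLSCACertificateFile  /etc/openldap/cacerts/my-ca.crt
TLSCertificateFile    /etc/openldap/certs/slapd.crt
TLSCertificateKeyFile /etc/openldap/certs/slapd.key

# /etc/openldap/ldap.conf (client side)
TLS_CACERT /etc/openldap/cacerts/my-ca.crt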
My sendmail server makes use of the TLS_SRV_OPTIONS which is set to `V' meaning it shouldn't verify certificates. As a server, it doesn't and the {verify} macro shows "NOT" in the logs, showing that no certificate request was sent out.
Acting as a client, though (and I'm talking both about the server acting as a client towards other mail servers and about the local mail submission agent), it always verifies certificates. My mail submission agent, when contacting my own mail server, verifies the mail server's certificate, and yet the mail server has not initiated any exchange of certificates, since it still says "verify=NOT" in the logs (whereas the same entry for the submission agent reads OK or FAIL depending on what I use).
So, do mail servers ALWAYS send out their certificates, and when they do, does the "client" in question (whether it's the mail server acting as a client or the mail submission agent) validate them simply because the TLS_SRV_OPTIONS setting applies only when running as a server? Or is there a setting to tell Sendmail not to send out certificates, since you're not in the business of certificate verification when relaying anyway?
I've recently been asked to setup our FTP server to accept connections from a remote host. They sent me a file "id_dsa.pub" with instructions to add this key to the xfer user.
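An id_dsa.pub file is an SSH public key rather than an FTP certificate, so this presumably means the remote host will connect over SFTP/SSH as the xfer user. Assuming that, the key is added like any other authorized key:

Code:
# Run as root; assumes the xfer account and its home directory exist
mkdir -p ~xfer/.ssh
cat id_dsa.pub >> ~xfer/.ssh/authorized_keys
chown -R xfer:xfer ~xfer/.ssh
chmod 700 ~xfer/.ssh
chmod 600 ~xfer/.ssh/authorized_keys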
When I use GMail with Konqueror, it switches to basic HTML because "my browser isn't supported".
Recently I tried Google Reader in Konqueror. It said that my browser isn't supported and Google Reader won't work. But it works perfectly.
I wonder if this means that GMail would work if not for their artificial limitation of checking which browser you have? Is there a way to circumvent it?
I am really tired of getting those SSL errors when I go to my intranet, so I am now trying to generate my own SSL certs (go me). I have easy-rsa installed for my OpenVPN; can I use that to sign the CSR?
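You can; easy-rsa is just a wrapper around openssl, so its ca.crt/ca.key pair can sign an arbitrary CSR. A sketch assuming the default easy-rsa 2.x keys directory and an intranet.csr you generated yourself:

Code:
openssl x509 -req -days 365 -in intranet.csr \
    -CA /etc/openvpn/easy-rsa/keys/ca.crt \
    -CAkey /etc/openvpn/easy-rsa/keys/ca.key \
    -CAcreateserial -out intranet.crt

Your browsers then need the easy-rsa ca.crt imported as a trusted authority, otherwise the intranet cert will be flagged just like a plain self-signed one.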
I have enrolled some certificates from my own CA to use with a couple of different services, like FTP, web, mail, etc. All these certificates come from the same CA (my own), and I have created a root CA. Is it possible to import this root CA into the whole system, so that I do not have to import the certificate or root CA into each application (iceweasel/firefox, chrome, icedove, filezilla, etc.) separately?
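On Debian-based systems the system-wide trust store is managed by the ca-certificates package, so the root CA can be dropped in and registered once; a sketch with an assumed file name:

Code:
# System-wide trust (Debian/Ubuntu); the file must end in .crt
cp my-root-ca.crt /usr/local/share/ca-certificates/my-root-ca.crt
update-ca-certificates

Applications that read the OpenSSL/GnuTLS system store will pick it up, but Mozilla-based programs (iceweasel/firefox, icedove) keep their own NSS certificate database, so those still need the CA imported through their own preferences or with NSS's certutil.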
I have the following problem with Konqueror. Every time I try to go to https://localhost:10000 (this is Webmin) or https://localhost:631, Konqueror asks me the following in a popup (translated from German):
"The authentication of the server has failed. The certificate does not match the server. The certificate has not been signed by a trusted certification authority."
Then I press "Continue". The next popup appears, asking: "Do you want to always accept this certificate without being asked again?", with the possible buttons "Always" or "Only this session". The problem is that I always press "Always", but obviously Konqueror is not remembering this certificate, since I have to press all the buttons a hundred times in the ongoing session and in every new session. In Firefox, I was only asked once and the certificate was stored in a list. Does anyone know how to fix this problem in Konqueror?
I just installed Citrix on my computer, but when I try to use it I get an error message saying: "You have not chosen to trust "Equifax Secure Global eBusiness CA-1", the issuer of the server's security certificate (SSL error 61)."
So I downloaded the certificates to allow me to use it, but I am unable to copy them to the /usr/lib/ICAClient/keystore/cacerts/ directory; I can't download them straight to that folder either. I have administrative privileges, but I still can't do anything with the files in those folders other than look at them. How do I put files in those folders?
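Those directories are owned by root, so administrative privileges only help if the copy is actually run as root (a normal file manager session is not). A sketch, with the downloaded file name and location below being just examples:

Code:
sudo cp ~/Downloads/Equifax_Secure_Global_eBusiness_CA-1.crt \
    /usr/lib/ICAClient/keystore/cacerts/
# or, if sudo is not set up:
su -c 'cp /home/youruser/Downloads/*.crt /usr/lib/ICAClient/keystore/cacerts/'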
Does anyone know how to configure an SSL cert with GoDaddy? The following Squid page seems to use x509 and PEM format for everything, while GoDaddy wants a CSR file in order to issue the cert. The x509 & PEM combo doesn't seem to generate these CSR files in the correct format. Does anyone know the openssl commands to generate the files and the config line(s) to put in squid.conf?
I started from this wiki: [URL] I also tried following this GoDaddy wiki, but it was for Apache, not Squid: [URL]
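The missing piece is usually that GoDaddy wants a CSR generated from your own private key, and the certificate they return (plus their intermediate bundle) then goes into squid.conf's https_port line. A sketch with placeholder host names and paths; the exact https_port options depend on your Squid version, so check squid.conf.documented:

Code:
# Generate a 2048-bit key and a CSR; paste the CSR text into GoDaddy's form
openssl genrsa -out proxy.example.com.key 2048
openssl req -new -key proxy.example.com.key -out proxy.example.com.csr

# GoDaddy returns the cert plus gd_bundle.crt (intermediates);
# appending the bundle to the cert keeps the chain in one file
cat proxy.example.com.crt gd_bundle.crt > /etc/squid/proxy_chain.pem

# squid.conf
https_port 443 cert=/etc/squid/proxy_chain.pem key=/etc/squid/proxy.example.com.key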
Is there a guide somewhere that covers all the security topics for Linux, somewhat from top to bottom? Such as LDAP, TLS, RSA, secure auth, generating certs, etc.; all of it and how it all ties together. Sure, I can find "you should use this" advice, or guides that don't explain much or how the pieces work together to complete the suite. TLD seems to suffer from the same thing I just described...
After the Sun Jan 24 20:22:46 UTC 2010 update in slackware-current (x86), I am unable to store SSL certificates as "Forever" when greeted with an unknown certificate under KDE. No application can save the certificates, e.g. konqueror, kmail, etc. I am aware that the above-mentioned update didn't bring any updated KDE applications/libs, but still, this stopped working right after the upgrade. The certificates can be accepted, and after doing so everything works as expected. The only annoying thing is that although I have selected to trust the certificate "Forever", it asks again after an application restart (e.g. konqueror, kmail) whether I want to trust this cert forever or only for the current session. So it seems I cannot store/save/trust SSL certificates forever with KDE.
I'm trying to install Debian Jessie, but the installation keeps failing when installing the ca-certificates package and then asks for a media change to the disc that is already in the drive, and keeps asking even though it is already there.
I've just bought a Linksys WRT610N router and I ran into various problems during configuration, which brought up some questions. Here is what I did to configure it (following the short manual that came with the router):
1. plugged the router into my modem and into my computer via ethernet cables
2. entered its IP address (given in the manual) in my browser and logged in with the factory login
3. changed the login password
After this, the problems I encountered were: I set the administration of the router to be disabled via wireless and enabled locally via HTTPS only, but when saving those settings I either lost the connection (the browser telling me the server was not accessible) or was asked to confirm a security certificate after being (logically) redirected to the HTTPS version of the administration page. After trying to log in again, I was able to log in only via HTTP and not via HTTPS, even though after logging in those parameters were still as I had set them (wireless administrative login disabled and local administrative login enabled only via HTTPS).
Via HTTPS, when I got something other than "the server is not responding or could be too busy", I was shown the untrusted connection page, saying:
"192.168.1.1 uses an invalid security certificate. The certificate is not trusted because it is self-signed. The certificate is only valid for Linksys. The certificate expired on 01/01/71 01:21. The current time is 19/04/11 22:56. (Error code: sec_error_expired_issuer_certificate)"
I noticed that after losing the connection and not being able to reach the router with either HTTP or HTTPS, the only way I was able to reconnect to it was to go into (I am using Firefox 4 on Squeeze) Edit > Preferences > Advanced > Encryption > View Certificates > Servers and delete the Linksys certificate.
Will this impact my Debian system at all, or will it work fine? Do you guys have any experience with this? I would rather try generating the certificates myself and change the key size from 1024 to 2048 bits.
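If the goal is just a fresh self-signed pair with a 2048-bit key, openssl will do it in one step; the file names below are placeholders:

Code:
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
    -keyout /etc/ssl/private/myhost.key \
    -out /etc/ssl/certs/myhost.crt
chmod 600 /etc/ssl/private/myhost.key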
I updated yesterday; the main change was from KDE 4.5 to 4.6. Since then, when I start kmail I always get a message about the certificate not applying to the given host. I use kmail to connect to a Dovecot IMAP server. Everything worked fine before. I know very little about certificates. I tried to regenerate the certificates (running /usr/share/doc/packages/dovecot/mkcert.sh), but I don't know what else to do.
I am in the process of securing our web server (apache) using openssl generated certificates. Is it possible to generate a certificate for both www.example.com and example.com?
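Yes; the usual way is a single certificate whose subjectAltName lists both names. With openssl this means a small config file passed to the req command; a sketch, with example.com standing in for your real domain:

Code:
# san.cnf
[req]
distinguished_name = req_distinguished_name
x509_extensions    = v3_req
req_extensions     = v3_req
prompt             = no

[req_distinguished_name]
CN = example.com

[v3_req]
subjectAltName = DNS:example.com, DNS:www.example.com

# Generate a self-signed cert (drop -x509 to produce a CSR for a CA to sign instead)
openssl req -new -x509 -nodes -days 365 -newkey rsa:2048 \
    -keyout example.com.key -out example.com.crt -config san.cnf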
I was wondering if it is possible to have different certificates for different directories within an HTTPS directory tree. What I want is that for a specific directory, a specific TLS certificate is required from the HTTP client in order to be authorized for that directory. Directory /var/www/html/secure/1 needs certificate A; directory /var/www/html/secure/2 needs a different certificate B. So I have one CA, which signs the certificates for the specific directories. The HTTP client presents certificate A or certificate B (to be authenticated for secure/1 or secure/2).
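Apache can't present a different server certificate per directory (the server certificate is negotiated before the URL is known), but it can demand a client certificate and then check which one was presented per directory, which sounds like what's described here: one CA signs certificate A and certificate B, and mod_ssl's SSLRequire distinguishes them. A sketch, with assumed subject CNs and paths:

Code:
# Inside the SSL vhost
SSLEngine on
SSLCACertificateFile /etc/apache2/ssl/my-ca.crt

<Directory /var/www/html/secure/1>
    SSLVerifyClient require
    SSLVerifyDepth  1
    SSLRequire %{SSL_CLIENT_S_DN_CN} eq "client-a"
</Directory>

<Directory /var/www/html/secure/2>
    SSLVerifyClient require
    SSLVerifyDepth  1
    SSLRequire %{SSL_CLIENT_S_DN_CN} eq "client-b"
</Directory>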