After successfully creating a KVM virtual karmic system, I can connect to it by ssh, but the initial login does not recognize the initial password specified in the cfg file:

Code:
[DEFAULT]
...
user=myusername
name=MyUserName
pass=password
...

By examining the output of

Code:
vmbuilder kvm ubuntu -o --debug -c ~/vm1.cfg > vm.debug 2>&1

I can find where the password is set; it is the correct password. The ssh login dialog responds with "Permission denied, please try again." The same happens with

Code:
ssh -l username 192.168.122.1

Rebuilding with no password specified defaults to 'ubuntu', but that does not work either. Did I miss something?
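For reference, here is a minimal sketch of the kind of cfg file I'm describing; the network values are illustrative assumptions, not my actual settings:

```
# vm1.cfg -- hypothetical vmbuilder config sketch
[DEFAULT]
arch = i386

[ubuntu]
suite = karmic
user = myusername          ; login name for the first account
name = MyUserName          ; full name (GECOS field)
pass = password            ; initial password -- the one ssh rejects
```

vmbuilder reads this with `-c ~/vm1.cfg`, so any typo here (or an option in the wrong section) would silently produce a guest with different credentials than expected.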
That IP (192.168.122.1) is the IP of the bridge on the host system, so naturally enough I can log in there with my own password. But when I rebuild the VM with a local address (192.168.2.55) configured and move the bridge to eth1 (which is 192.168.2.1), the attempt to connect to the VM via ssh is refused. As an aside, I get "no console available for the domain" when I try to use

Code:
virsh # console VM1

So I'm still stuck with a VM that I can't communicate with.
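A few diagnostic commands that might narrow this down, assuming the guest is named VM1 and the bridge on eth1 is br0 (both names are assumptions here):

```
# Confirm the bridge exists and that eth1 is actually enslaved to it
brctl show

# Find the guest's MAC address from its domain definition
virsh dumpxml VM1 | grep 'mac address'

# After pinging 192.168.2.55, check whether that MAC answered --
# if the ARP entry is incomplete, the guest never got the address
arp -n | grep 192.168.2.55
```

If the MAC shows up in the ARP cache but ssh is still refused, the problem is in the guest (sshd or firewall) rather than the bridging.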
Sorry this is turning into a monologue... I got around the login problem by allowing ssh to go out to the internet and then back in to the VM. All's fine there, but I'm left with two questions. First, I'd like to find a better way to make the ssh connection; that's more of a Shorewall question, however. Second, how do I set up a console session on the VM? I can't seem to find any documentation on this, and I can't get the `console` or `ttyconsole` virsh commands to yield any useful information. Anyway, it took the better part of the day, but I seem to be up and running a karmic KVM system inside a karmic physical server. Now to install something useful... And thanks to Falko for the tutorial.
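In case it helps anyone with the console question: the usual reason `virsh console` yields nothing is that no getty is listening on the guest's serial port. Since karmic uses upstart, a sketch of what I'd try inside the guest (file name and getty options are standard, but treat this as an untested outline):

```
# Inside the guest: create an upstart job so a getty runs on ttyS0.
cat <<'EOF' > /etc/init/ttyS0.conf
# ttyS0 - getty on the virtual serial port for virsh console
start on stopped rc RUNLEVEL=[2345]
stop on runlevel [!2345]
respawn
exec /sbin/getty -L 38400 ttyS0 vt102
EOF
```

On the host, the domain XML also needs a serial device (something like `<serial type='pty'><target port='0'/></serial>`); with both in place, `virsh console VM1` should present a login prompt after the guest is rebooted.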