AHV Deploying FortiAuthenticator

  • 14 March 2019
  • 4 replies

I'm trying to deploy the FortiAuthenticator v5.5 VM but am running into issues on AHV.

I've tried creating the VM using the files from Fortinet, but their deployment guide has no instructions for AHV.

The files I've used are the following:
  • KVM - fackvm.qcow2, datadrive.qcow2
  • VMware - fac.vmdk, datadrive.vmdk
  • Screenshot attached of my options from Fortinet

  1. Created the VM, adding cloned disks using the files above
  2. Set vCPU and memory, added NICs
Fortinet suggested I use the vmdk files, but I'm still getting the error in the screenshot. I'm new to AHV, so any help would be appreciated.


Best answer by MMSW_DE 15 March 2019, 14:38


Userlevel 3
Badge +10
AHV isn't able to read the vdisk, so it's likely a driver issue.

Ask Fortinet whether they have any way to include the VirtIO drivers in the image.
If you need to, you can go as far as deploying the image to VMware Workstation or Oracle VirtualBox and installing the VirtIO drivers yourself (or just installing Nutanix Guest Tools), then exporting the VM in VMDK format and cloning from that disk. If you do this, you have a much better chance of getting it to run.

If you have a dedicated VMware environment, you could run Xtract against the system, and it will make sure the necessary drivers are in place. You could do the same thing with VMware Workstation as well.
Userlevel 3
Badge +6
We do not use FortiAuthenticator, but we have successfully deployed FortiManager and FortiAnalyzer VMs to our AHV cluster. I just gave the FortiAuthenticator 5.5 KVM image a try and got the same error you did.

The FortiAuthenticator 6.0 KVM image seems to work, though. At least it boots to a login prompt.
Thanks @ddubuque & @MMSW_DE for the quick replies. I was OoO for a week and have just now been able to try v6, which was released 3/14. I too get a login screen now; hoping the rest of the config goes smoothly.
FYI - Fortinet published a KB article explaining the root cause of some of the Forti products failing on Nutanix / KVM. Have a look here -

This is due to the Forti products (Manager, Reporter) failing to recognize SCSI disks presented by Nutanix. The trick is to use the PCI disk type. Make sure you upload the qcow2 image manually on the node (using Prism Element rather than Prism Central) and place it in the right storage container. Prism Central will drop the qcow2 image into the SelfService container, and once cloned, the new appliance will stay in the SelfService storage container as well!
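As a rough sketch, the manual upload can also be done with `acli` from a CVM shell; the image name, source URL, and container name below are illustrative placeholders, not values from this thread:

```shell
# Hypothetical example: upload the boot image into a specific storage
# container so it does not land in the SelfService container.
acli image.create fac-boot-image \
    source_url=nfs://127.0.0.1/default-container/fackvm.qcow2 \
    container=default-container \
    image_type=kDiskImage
```

Creating the image this way pins it to the container you choose, which is the point of uploading via Prism Element instead of Prism Central.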

A quick summary of KB FD40080 below:
Because there is no writable storage and the VM boots from a read-only qcow2 image, any changes made to the VM are lost after a reboot.


The problem is that the disks are attached to the VM over the SCSI bus. The behavior is exactly the same even when the image is deployed on a plain KVM hypervisor.

To resolve the issue, do the following:

1. Create / update the VM and remove all disks (if any are already attached)
2. Attach the boot disk (2 GB) as PCI
3. Create a new data disk of at least 30 GB and attach it over the PCI bus as well
4. Attach a NIC to the VM
5. Boot the VM
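The steps above can be sketched with `acli` from a CVM; the VM name, image name, container, and network below are hypothetical placeholders:

```shell
# Illustrative sketch only -- adjust names, sizing, and network to your cluster.
acli vm.create fac-vm num_vcpus=2 memory=4G

# Step 2: attach the boot disk, cloned from the uploaded image, on the PCI bus
acli vm.disk_create fac-vm clone_from_image=fac-boot-image bus=pci

# Step 3: create a new data disk (30 GB minimum), also on the PCI bus
acli vm.disk_create fac-vm create_size=30G container=default-container bus=pci

# Step 4: attach a NIC
acli vm.nic_create fac-vm network=vlan.0

# Step 5: power on the VM
acli vm.on fac-vm
```

The `bus=pci` parameter on both `vm.disk_create` calls is what works around the SCSI recognition issue described in the KB.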