PureDisk 6.6.1 Central SPA in ESX 4.1 (5 Nodes)
Hi all, I have a customer (IHAC) that has configured a central SPA (5 nodes, non-HA) on ESX 4.1 (3 ESX hosts), with 80 remote office locations replicating to the central SPA. The node breakdown is as follows:

Node 1: SPA, MBE, CR
Node 2: CR, MBE
Node 3: CR, MBE
Node 4: CR
Node 5: CR

Each node is configured with 4 GB RAM and 4 vCPUs, and all networks are on a flat virtual machine network. We do support PureDisk in VMware per the statements below; however, has anybody seen this configuration in the field, and does anyone have comments, experience, or best-practice statements to share? My personal view is that this should not be configured in VMware, due to the storage performance of VMFS and non-RDM disks.

Notes from the compatibility matrix:

There are three use cases for running PureDisk Remote Office Edition on a VMware ESX server. Depending on whether the ESX server is dedicated or shared, we take advantage of the virtualization capabilities of the ESX environment to set up multi-node PureDisk environments on one physical ESX server.

Remote Office Shared ESX Infrastructure
- ESX infrastructure shared between PureDisk and other VMware guests
- All-in-one PureDisk server with a storage capacity of 500 GB to 1 TB
- PureDisk VMware guest running in a VMDK file on top of a VMFS partition on internal or direct attached storage
- Typical client agent count: up to 10 systems

Remote Office Dedicated ESX Infrastructure
- ESX infrastructure dedicated to PureDisk VMware guests
- Multi-node PureDisk server with a storage capacity of up to 8 TB
- 3 to 4 PureDisk VMware guests using Raw Device Mapping to access the storage

Datacenter Dedicated ESX Infrastructure
- ESX infrastructure dedicated to PureDisk VMware guests
- Multi-node PureDisk server with a storage capacity of up to 24 TB
- 3 to 10 PureDisk VMware guests using Raw Device Mapping to access the storage

General Hardware Requirements
Please refer to the VMware compatibility guides for ESX-compatible hardware and software. The minimum hardware requirements are specific to each use case and are listed below.
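The two dedicated use cases above access their storage through Raw Device Mapping rather than VMDK files on VMFS. As an illustrative sketch only (the device identifier and datastore paths below are placeholders, not taken from the matrix), an RDM pointer file is typically created on the ESX host with vmkfstools:

```
# Virtual compatibility mode RDM (placeholder device ID and paths):
vmkfstools -r /vmfs/devices/disks/naa.<device-id> /vmfs/volumes/datastore1/pdnode1/pdnode1_rdm.vmdk

# Physical (pass-through) compatibility mode RDM:
vmkfstools -z /vmfs/devices/disks/naa.<device-id> /vmfs/volumes/datastore1/pdnode1/pdnode1_rdmp.vmdk
```

The resulting pointer .vmdk is attached to the PureDisk guest like any other disk; physical compatibility mode passes SCSI commands through to the device, which is often the mode chosen for large backup storage.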
For each vCPU, a physical CPU core should be available on the dedicated ESX servers.

Hardware Requirements: Shared Remote Office ESX Server
The PureDisk all-in-one VMware guest requires the following ESX resources:
- 2 vCPUs
- 4 GB RAM
- 500 GB to 1 TB on the VMFS storage partition

Hardware Requirements: Dedicated Remote Office ESX Server
The PureDisk VMware guests require the following ESX resources:
- 4 to 6 vCPUs
- 8 to 12 GB RAM
- up to 8 TB of internal or direct attached storage, accessed using Raw Device Mapping

Hardware Requirements: Datacenter ESX Server
The PureDisk VMware guests require the following ESX resources:
- 4 to 16 vCPUs
- 8 to 32 GB RAM
- up to 24 TB of FC SAN-based storage, accessed using Raw Device Mapping

PDVA 6.6.1 - Unable to set FQDN
I have just imported the PDVA 6.6.1 virtual appliance and, having got through to the setup (after logging in as root), it always fails on network setup with the error "unable to set FQDN". I have been through numerous iterations of setting different FQDNs and cannot get past this point (static settings or DHCP). Exiting the wizard and running YaST, I can set up the VM network card with DHCP or a static address and can reach the local DNS/DFG/DCs etc. without issue. My question: has anyone got to this point and failed before?

It shows that you are trying to install it on 32-bit whereas it is 64-bit
I have ESX 4.0 running inside VMware Workstation 7.0. When I created a new VM and mounted the PureDisk ISO, the setup says that the system is 32-bit and needs to be 64-bit. My host machine is 64-bit, running on an i5-450 processor, and the PureDisk ISO is the 64-bit ISO.

PDVA Install Problems on ESX 3.5
I got PDVA 6.5 up and running as a test with no problem. Now I am building PDVA 6.6 and having issues. I imported the virtual appliance successfully into VMware and verified the settings. The OS would partially boot and then the VM would shut down; the error log reported a NIC incompatibility. I noticed the guest OS was set to "Other" and the NIC was "Flexible". After changing the NIC to vmxnet, the system booted successfully. I registered A, CNAME, and PTR records in DNS and attempted the install. YaST runs and I fill in the network configuration, but choosing "Next" gives me the error "Setting the SPA's FQDN failed". This is a preconfigured virtual appliance; I would expect it to work out of the box. What am I missing? Thanks.

Move MBE and SPA to new Server
Has anyone gone through the process of moving their MBE or SPA to a new physical (or virtual) server? I originally built an all-in-one PD server with 1 TB of internal storage. My environment has grown past the initial server, and I now want to move the SPA and MBE to different physical or virtual machines. The long-term goal is to repurpose the original PD server. Thanks in advance for any insight.
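Two of the PDVA threads above hit the "unable to set FQDN" error during setup; that error is commonly a DNS consistency problem rather than an appliance fault. As a hedged sketch (this script is not part of the appliance, and pdva01.example.com is a placeholder name), forward and reverse resolution can be sanity-checked before rerunning the wizard:

```python
import socket

def check_fqdn(fqdn):
    """Return True when the forward (A) and reverse (PTR) lookups agree."""
    try:
        ip = socket.gethostbyname(fqdn)                # forward lookup: A record
        reverse_name, _, _ = socket.gethostbyaddr(ip)  # reverse lookup: PTR record
    except OSError:                                    # NXDOMAIN, missing PTR, no resolver
        return False
    return reverse_name.lower() == fqdn.lower()

if __name__ == "__main__":
    # Placeholder; substitute the FQDN you entered in the wizard.
    print(check_fqdn("pdva01.example.com"))
```

If this returns False for the name you gave the wizard, registering matching A and PTR records (as the ESX 3.5 poster did) is a reasonable first step.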