vxconfigd core dumps at vxdisk scandisks after zpool removed from ldom
Hi, I'm testing InfoScale 7.0 on Solaris with LDoms. Creating a zpool in the LDom works, but something does not seem to be working properly. On the LDom console I see:

```
May 23 16:19:45 g0102 vxdmp: [ID 557473 kern.warning] WARNING: VxVM vxdmp V-5-3-2065 dmp_devno_to_devidstr ldi_get_devid failed for devno 0x11500000000
May 23 16:19:45 g0102 vxdmp: [ID 423856 kern.warning] WARNING: VxVM vxdmp V-5-0-2046 : Failed to get devid for device 0x20928e88
```

After I destroy the zpool, I would like to remove the disk from the LDom. To be able to do that, I first disable the path and remove the disk from VxVM:

```
/usr/sbin/vxdmpadm -f disable path=c1d1s2
/usr/sbin/vxdisk rm c1d1s2
```

After this I'm able to remove the disk from the LDom using ldm remove-vdisk, but the DMP configuration is not cleaned up:

```
# /usr/sbin/vxdmpadm getsubpaths ctlr=c1
NAME         STATE[A]     PATH-TYPE[M] DMPNODENAME  ENCLR-TYPE   ENCLR-NAME   ATTRS
================================================================================
NONAME       DISABLED(M)  -            NONAME       OTHER_DISKS  other_disks  STANDBY
c1d0s2       ENABLED(A)   -            c1d0s2       OTHER_DISKS  other_disks  -
#
```

If I run vxdisk scandisks at this stage, the vxdisk command hangs and vxconfigd dumps core:

```
# file core
core: ELF 32-bit MSB core file SPARC Version 1, from 'vxconfigd'
# pstack core
core 'core' of 378:     vxconfigd -x syslog -m boot
------------  lwp# 1 / thread# 1  ---------------
001dc018 ddl_get_disk_given_path (0, 0, 0, 0, 66e140, 0)
001d4230 ddl_reconfigure_all (49c00, 0, 400790, 3b68e8, 404424, 404420) + 690
001b0bfc ddl_find_devices_in_system (492e4, 3b68e8, 42fbec, 4007b4, 4db34, 0) + 67c
0013ac90 find_devices_in_system (2, 3db000, 3c00, 50000, 0, 3d9400) + 38
000ae630 ddl_scan_devices (3fc688, 654210, 0, 0, 0, 3fc400) + 128
000ae4f4 req_scan_disks (660d68, 44fde8, 0, 654210, ffffffec, 3fc400) + 18
00167958 request_loop (1, 44fde8, 3eb2e8, 1800, 19bc, 1940) + bfc
0012e1e8 main (3d8000, ffbffcd4, ffffffff, 42b610, 0, 33bb7c) + f2c
00059028 _start (0, 0, 0, 0, 0, 0) + 108
```

Thanks,
Marcel
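For anyone reproducing this, the overall sequence being attempted looks roughly like the sketch below. The pool name "testpool" and vdisk name "vdisk1" are placeholders, not from the post; the discovery and restart commands at the end are standard VxVM tools, but while the stale NONAME path exists they may walk the same code path that crashes here.

```sh
# Inside the guest LDom: release the disk before detaching it.
# "testpool" and "vdisk1" are hypothetical names for this sketch.
zpool destroy testpool                        # free the disk from ZFS
/usr/sbin/vxdmpadm -f disable path=c1d1s2     # force-disable the DMP path
/usr/sbin/vxdisk rm c1d1s2                    # drop the disk from the VxVM disk list

# From the control domain: detach the virtual disk from the guest.
ldm remove-vdisk vdisk1 g0102

# Back inside the guest: re-run device discovery and verify the stale
# NONAME subpath is gone. Note that vxdctl enable drives the same DDL
# discovery as vxdisk scandisks, so it may hit the same crash until the
# stale path is cleaned up.
/usr/sbin/vxdctl enable
/usr/sbin/vxdmpadm getsubpaths ctlr=c1

# If vxconfigd has already dumped core, it can be restarted in place:
/usr/sbin/vxconfigd -k -x syslog              # -k kills and respawns the daemon
```

The stack in the core (ddl_reconfigure_all calling ddl_get_disk_given_path with all-zero arguments) points at the device discovery layer iterating over a path whose backing device is gone, which would be consistent with the leftover NONAME entry.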
New Infoscale v7 Installation - Can't add Hosts

This is a new installation of InfoScale v7 on RHEL v6. Bidirectional port 5634 is open between the InfoScale Management Server (RHEL) and the managed host (Solaris 10 SPARC), and port 22 is open one way, from the management server to the managed host. The host has VRTSsfmh installed, with xprtld running and listening on port 5634:

```
solvcstst01:/etc/ssh {root}: ps -ef|grep xprtld
    root  3893     1   0   Mar 01 ?           0:47 /opt/VRTSsfmh/bin/xprtld -X 1 /etc/opt/VRTSsfmh/xprtld.conf
    root  7477 24284   0 08:28:34 pts/1       0:00 grep xprtld
```

I've temporarily allowed direct root login from the managemen­t server to the managed host and entered those credentials. When I add the host from the InfoScale server, it fails with the error "Registration with Management Server failed".
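Before digging into the log, a useful sanity check is to query xprtld's vital-stats page from each side. This is a sketch assuming the stock VRTSsfmh layout; the same JSON blocks that appear in the add-host log below should come back, including the PEER_NAME each daemon resolves for its caller:

```sh
# From the managed host: query xprtld on the management server.
/opt/VRTSsfmh/bin/xprtlc -l https://lvmvom01.healthbc.org:5634/admin/xprtld/vital_stats

# From the management server: query xprtld on the managed host.
/opt/VRTSsfmh/bin/xprtlc -l https://solvcstst01.vch.ca:5634/admin/xprtld/vital_stats
```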
Error log:

```
Add Host Log
------------
Started [04/12/2016 08:30:23]
[04/12/2016 08:30:23] [solvcstst01.vch.ca] type rh solvcstst01.vch.ca cms
[04/12/2016 08:30:23] [solvcstst01.vch.ca] creating task for Add host
[04/12/2016 08:30:24] [solvcstst01.vch.ca] Check if MH is pingable from MS and get vital information from MH
[04/12/2016 08:30:24] [solvcstst01.vch.ca] Output: {
    "XPRTLD_VERSION" : "5.0.196.0",
    "LOCAL_NAME" : "solvcstst01.vch.ca",
    "LOCAL_ADDR" : "139.173.8.6",
    "PEER_NAME" : "UNKNOWN",
    "PEER_ADDR" : "10.248.224.116",
    "LOCAL_TIME" : "1460475024",
    "LOCALE" : "UNKNOWN",
    "DOMAIN_MODE" : "FALSE",
    "QUIESCE_MODE" : "RUNNING",
    "OSNAME" : "SunOS",
    "OSRELEASE" : "5.10",
    "CPUTYPE" : "sparc",
    "OSUUID" : "{00020014-4ffa-b092-0000-000084fbfc3f}",
    "DOMAINS" : { }
}
[04/12/2016 08:30:24] [solvcstst01.vch.ca] Return code: 0
[04/12/2016 08:30:24] [solvcstst01.vch.ca] Checking if MH version [5.0.196.0] is same or greater than as that of least supported MH version [5.0.0.0]
[04/12/2016 08:30:24] [solvcstst01.vch.ca] ADD_HOST_PRECONFIG_CHK
[04/12/2016 08:30:24] [solvcstst01.vch.ca] {"JOB":{"RETURNCODE":0,"ERROR":"Success","NAME":"add_host","OUTPUT":"ADD_HOST_PRECONFIG_CHK","STATE":"SUCCESS","PROGRESS":1}}
[04/12/2016 08:30:24] [solvcstst01.vch.ca] retrieving Agent password
[04/12/2016 08:30:24] [solvcstst01.vch.ca] ADD_HOST_INPUT_PARAM_CHK
[04/12/2016 08:30:24] [solvcstst01.vch.ca] {"JOB":{"RETURNCODE":0,"ERROR":"Success","NAME":"add_host","OUTPUT":"ADD_HOST_INPUT_PARAM_CHK","STATE":"SUCCESS","PROGRESS":6}}
[04/12/2016 08:30:24] [solvcstst01.vch.ca] user name is "root"
[04/12/2016 08:30:24] [solvcstst01.vch.ca] ADD_HOST_CONTACTING_MH
[04/12/2016 08:30:24] [solvcstst01.vch.ca] {"JOB":{"RETURNCODE":0,"ERROR":"Success","NAME":"add_host","OUTPUT":"ADD_HOST_CONTACTING_MH","STATE":"SUCCESS","PROGRESS":20}}
[04/12/2016 08:30:25] [solvcstst01.vch.ca] Output: HTTP/1.1 302 OK
Status: 307 Moved
Location: /admin/htdocs/cs_config.htm
[04/12/2016 08:30:25] [solvcstst01.vch.ca] Return code: 768
[04/12/2016 08:30:25] [solvcstst01.vch.ca] Checking to see if CS is reachable from MH
[04/12/2016 08:30:25] [solvcstst01.vch.ca] Output: {
    "XPRTLD_VERSION" : "7.0.0.0",
    "LOCAL_NAME" : "lvmvom01.healthbc.org",
    "LOCAL_ADDR" : "10.248.224.116",
    "PEER_NAME" : "solvcstst01.vch.ca",
    "PEER_ADDR" : "139.173.8.6",
    "LOCAL_TIME" : "1460475025",
    "LOCALE" : "en_US.UTF-8",
    "DOMAIN_MODE" : "TRUE",
    "QUIESCE_MODE" : "RUNNING",
    "OSNAME" : "Linux",
    "OSRELEASE" : "2.6.32-573.22.1.el6.x86_64",
    "CPUTYPE" : "x86_64",
    "OSUUID" : "{00010050-56ad-1e25-0000-000000000000}",
    "DOMAINS" : {
        "sfm://lvmvom01.healthbc.org:5634/" : {
            "admin_url" : "vxss://lvmvom01.healthbc.org:14545/sfm_admin/sfm_domain/vx",
            "primary_broker" : "vxss://lvmvom01.healthbc.org:14545/sfm_agent/sfm_domain/vx"
        }
    }
}
[04/12/2016 08:30:25] [solvcstst01.vch.ca] Return code: 0
[04/12/2016 08:30:25] [solvcstst01.vch.ca] CS host (lvmvom01.healthbc.org) is resolvable
[04/12/2016 08:30:25] [solvcstst01.vch.ca] Trying to figure out if host is already part of the domain
[04/12/2016 08:30:25] [solvcstst01.vch.ca] ADD_HOST_SEND_CRED_MH
[04/12/2016 08:30:25] [solvcstst01.vch.ca] {"JOB":{"RETURNCODE":0,"ERROR":"Success","NAME":"add_host","OUTPUT":"ADD_HOST_SEND_CRED_MH","STATE":"SUCCESS","PROGRESS":30}}
[04/12/2016 08:30:26] [solvcstst01.vch.ca] Output: SUCCESS
[04/12/2016 08:30:26] [solvcstst01.vch.ca] Return code: 0
[04/12/2016 08:30:28] [solvcstst01.vch.ca] push_exec command succeeded [/opt/VRTSsfmh/bin/getvmid_script]
[04/12/2016 08:30:29] [solvcstst01.vch.ca] ADD_HOST_INIT_DISCOVERY
[04/12/2016 08:30:29] [solvcstst01.vch.ca] {"JOB":{"RETURNCODE":0,"ERROR":"Success","NAME":"add_host","OUTPUT":"ADD_HOST_INIT_DISCOVERY","STATE":"SUCCESS","PROGRESS":75}}
[04/12/2016 08:30:29] [solvcstst01.vch.ca] Executing /opt/VRTSsfmh/bin/xprtlc -u "root" -t 1200 -j /var/opt/VRTSsfmh/xprtlc-payload-x2s4xFEb -l https://solvcstst01.vch.ca:5634/admin/cgi-bin/sfme.pl operation=configure_mh&cs-hostname=lvmvom01.healthbc.org&cs-ip=10.248.224.116&mh-hostname=solvcstst01.vch.ca&agent-password=******
[04/12/2016 08:30:32] [solvcstst01.vch.ca] Waiting for output from configure_mh----
[04/12/2016 08:30:32] [solvcstst01.vch.ca] ADD_HOST_INIT_DISCOVERY
[04/12/2016 08:30:32] [solvcstst01.vch.ca] {"JOB":{"RETURNCODE":0,"ERROR":"Success","NAME":"add_host","OUTPUT":"ADD_HOST_INIT_DISCOVERY","STATE":"SUCCESS","PROGRESS":80}}
[04/12/2016 08:30:33] [solvcstst01.vch.ca] Waiting for output from configure_mh----
[04/12/2016 08:30:33] [solvcstst01.vch.ca] ADD_HOST_INIT_DISCOVERY
[04/12/2016 08:30:33] [solvcstst01.vch.ca] {"JOB":{"RETURNCODE":0,"ERROR":"Success","NAME":"add_host","OUTPUT":"ADD_HOST_INIT_DISCOVERY","STATE":"SUCCESS","PROGRESS":80}}
[04/12/2016 08:30:45] [solvcstst01.vch.ca] Waiting for output from configure_mh----
[04/12/2016 08:30:45] [solvcstst01.vch.ca] ADD_HOST_INIT_DISCOVERY
[04/12/2016 08:30:45] [solvcstst01.vch.ca] {"JOB":{"RETURNCODE":0,"ERROR":"Success","NAME":"add_host","OUTPUT":"ADD_HOST_INIT_DISCOVERY","STATE":"SUCCESS","PROGRESS":80}}
[04/12/2016 08:30:56] [solvcstst01.vch.ca] Waiting for output from configure_mh----
[04/12/2016 08:30:56] [solvcstst01.vch.ca] ADD_HOST_INIT_DISCOVERY
[04/12/2016 08:30:56] [solvcstst01.vch.ca] {"JOB":{"RETURNCODE":0,"ERROR":"Success","NAME":"add_host","OUTPUT":"ADD_HOST_INIT_DISCOVERY","STATE":"SUCCESS","PROGRESS":80}}
[04/12/2016 08:30:56] [solvcstst01.vch.ca] Waiting for output from configure_mh----
[04/12/2016 08:30:56] [solvcstst01.vch.ca] ADD_HOST_INIT_DISCOVERY
[04/12/2016 08:30:56] [solvcstst01.vch.ca] {"JOB":{"RETURNCODE":0,"ERROR":"Success","NAME":"add_host","OUTPUT":"ADD_HOST_INIT_DISCOVERY","STATE":"SUCCESS","PROGRESS":80}}
[04/12/2016 08:30:57] [solvcstst01.vch.ca] Waiting for output from configure_mh----
[04/12/2016 08:30:57] [solvcstst01.vch.ca] ADD_HOST_INIT_DISCOVERY
[04/12/2016 08:30:57] [solvcstst01.vch.ca] {"JOB":{"RETURNCODE":0,"ERROR":"Success","NAME":"add_host","OUTPUT":"ADD_HOST_INIT_DISCOVERY","STATE":"SUCCESS","PROGRESS":80}}
[04/12/2016 08:30:57] [solvcstst01.vch.ca] Waiting for output from configure_mh----
[04/12/2016 08:30:57] [solvcstst01.vch.ca] ADD_HOST_INIT_DISCOVERY
[04/12/2016 08:30:57] [solvcstst01.vch.ca] {"JOB":{"RETURNCODE":0,"ERROR":"Success","NAME":"add_host","OUTPUT":"ADD_HOST_INIT_DISCOVERY","STATE":"SUCCESS","PROGRESS":80}}
[04/12/2016 08:30:57] [solvcstst01.vch.ca] Waiting for output from configure_mh----
[04/12/2016 08:30:57] [solvcstst01.vch.ca] ADD_HOST_INIT_DISCOVERY
[04/12/2016 08:30:57] [solvcstst01.vch.ca] {"JOB":{"RETURNCODE":0,"ERROR":"Success","NAME":"add_host","OUTPUT":"ADD_HOST_INIT_DISCOVERY","STATE":"SUCCESS","PROGRESS":80}}
[04/12/2016 08:30:58] [solvcstst01.vch.ca] Waiting for output from configure_mh----
[04/12/2016 08:30:58] [solvcstst01.vch.ca] ADD_HOST_INIT_DISCOVERY
[04/12/2016 08:30:58] [solvcstst01.vch.ca] {"JOB":{"RETURNCODE":0,"ERROR":"Success","NAME":"add_host","OUTPUT":"ADD_HOST_INIT_DISCOVERY","STATE":"SUCCESS","PROGRESS":80}}
[04/12/2016 08:30:58] [solvcstst01.vch.ca] fancy_die
[04/12/2016 08:30:58] [solvcstst01.vch.ca] CONFIGURE_MH_REG_FAILED
[04/12/2016 08:30:58] [solvcstst01.vch.ca] {"JOB":{"RETURNCODE":-1,"ERROR":"CONFIGURE_MH_REG_FAILED","NAME":"job_add_host","OUTPUT":"","STATE":"FAILED","PROGRESS":100}}{"RESULT":{"RETURNCODE":-1,"UMI":"V-383-50513-5760","ERROR":"CONFIGURE_MH_REG_FAILED","NAME":"add_host","TASKID":"{iHUXu2IK1ZRkTo7H}"}}
[04/12/2016 08:30:58] [solvcstst01.vch.ca] fancy_dead
```
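Everything up to configure_mh succeeds; the job dies only when the agent on the managed host tries to register back with the management server (UMI V-383-50513-5760, CONFIGURE_MH_REG_FAILED), and the job passes cs-ip=10.248.224.116 to configure_mh. One detail stands out in the first vital-stats block: the managed host reports PEER_NAME : UNKNOWN for the management server's address 10.248.224.116, which means it could not reverse-resolve that IP. A reasonable first check, sketched below, is name resolution in both directions on both machines:

```sh
# On the managed host (Solaris): resolve the management server both ways.
# PEER_NAME : UNKNOWN in the vital stats suggests the reverse lookup fails.
nslookup lvmvom01.healthbc.org      # forward: name -> IP
nslookup 10.248.224.116             # reverse: IP -> name

# On the management server (RHEL): resolve the managed host both ways.
getent hosts solvcstst01.vch.ca
getent hosts 139.173.8.6
```

If the reverse lookup is indeed broken, adding a "10.248.224.116 lvmvom01.healthbc.org" entry to /etc/hosts on the managed host is a low-risk way to test whether registration then completes.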