Patch Name: PHSS_26726
Patch Description: s700_800 10.20 OV ITO6.0X HP-UX 11.00/11.11 Agent A.06.11
Creation Date: 02/05/28
Post Date: 02/06/07
Hardware Platforms - OS Releases:
  s700: 10.20
  s800: 10.20
Products: OpenView IT/Operations 6.0
Filesets: OVOPC-CLT.OVOPC-UX11-CLT,A.06.00
Automatic Reboot?: No
Status: General Release
Critical: No
Path Name: /hp-ux_patches/s700_800/10.X/PHSS_26726
Symptoms:
PHSS_26726:
- SR: H555006719
  If the agent is running as a non-root user and the management server
  processes are restarted, the agent has to be restarted as well;
  otherwise messages are buffered.
- SR: 8606222554
  Certain policies in VPW do not work as expected, for example:
    VP_WIN-WINS-FwdAllInformation
    VP_WIN-WINS-FwdAllWarnError
    VP_WIN-DHCPCl_FwdAllInfo
    VP_WIN-DHCPCl_FwdAllWarnError
  This problem can also occur for VPO during condition matching:
  matching of the application and object attributes is case sensitive,
  so a message with application "TEST" is matched while application
  "tEST" is not.
- SR: 8606227840
  Variables in the template default message key are not resolved for
  unmatched messages.
- SR: 8606233602
  With a pattern like '<*.prefix>ERR<*.suffix>', the prefix variable is
  assigned wrong text when it should be empty.
- SR: B555012929
  Running opcdista from the command line prints only the internal
  status letters and no useful messages. For supportability, it would
  be better to have explicit status and error reporting.
- SR: B555013371
  Sometimes the new scheduled action template configuration is not
  loaded after a distribution; instead, the old scheduled actions are
  still started.
- SR: B555013435
  The message agent opcmsga hangs unpredictably. This is more likely to
  happen on systems with very high ICMP traffic.
- SR: B555013495
  In Japanese environments, programs using the agent APIs can fail with
  errors about invalid or incompatible codesets.
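The case-sensitivity issue of SR 8606222554 is addressed in the Defect Description below by the new OPC_COND_FIELD_ICASE opcinfo flag. A hedged sketch of the corresponding entry follows; the opcinfo file location varies by platform, and /opt/OV/bin/OpC/install/opcinfo is an assumption for HP-UX 11.x nodes that you should verify locally:

```
# opcinfo -- example entry (sketch); restart the agent after editing,
# e.g. with opcagt -stop / opcagt -start
OPC_COND_FIELD_ICASE TRUE
```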
- SR: B555013699
  When running actions or applications like 'yes | something', the
  'yes' process keeps running after the action has finished on HP-UX 11.
PHSS_25539:
- SR: H555006719
  If the agent is running as a non-root user and the management server
  processes are restarted, the agent has to be restarted as well;
  otherwise all messages are buffered.
- SR: 8606213476
  Distribution to nodes may hang or fail. This is more likely to happen
  when distributing to Windows NT/2000 nodes than to UNIX nodes.
- SR: B555007980
  Local automatic actions are started immediately, even though the
  agent MSI is enabled in divert mode and the 'Immediate Local
  Automatic Action' box is not checked.
- SR: B555008220
  The <$MSG_TIME_CREATED> variable is not substituted in the message
  template.
- SR: B555008838
  The event correlation engine reports a 'Time cannot go backwards'
  error if the system is very busy.
- SR: B555009745
  The template default of the object field of a monitor template is not
  used.
- SR: B555010620
  Some messages are missing from the Japanese message catalog, causing
  a 'Cannot generate message' error.
- SR: B555010955
  Even if opcswitchuser.sh was used to specify a non-root user to run
  the ITO agent, the agent is still started as user root after a system
  reboot.
- SR: B555010966
  A message key relation containing <*> does not always match message
  keys correctly. This results in messages not being acknowledged when
  they should be.
- SR: B555011184
  opcagt fails to start opcctla if it is invoked as ./opcagt and
  /opt/OV/bin/OpC is not in the search PATH.
- SR: B555011505
  1. opcecm/opceca might deadlock while processing many ECS annotate
     nodes.
  2. opcecm/opceca might leak memory when ECS annotate nodes are used.
- SR: B555011594
  The original message text of a logfile encapsulator message is wrong
  if <$LOGPATH> or <$LOGFILE> is used.
- SR: B555011638
  Pattern matching cannot match the newline character of multi-line
  messages.
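As an illustration of the line-break matching that SR B555011638 is about, the sketch below counts UNIX (\n) or NT (\r\n) line breaks in a message, the way a pattern such as <1/> (exactly one line break, per the Defect Description) would need to. count_breaks is a hypothetical stand-in, not a VPO tool:

```shell
# Hypothetical helper, not part of VPO: count the line breaks inside a
# message so a condition could require an exact number of them.
count_breaks() {
    # awk counts records; both "\n" and "\r\n" terminate a record
    printf '%s' "$1" | awk 'END { print (NR > 1 ? NR - 1 : 0) }'
}
```

A two-line opcmsg-style message reports exactly one break, which corresponds to the documented semantics of <1/>.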
- SR: B555011979
  Pattern matching hangs if only single-byte Japanese HANKAKU KANA
  characters are used.
- SR: B555011990
  The ECS event log (ecevilg) contains an invalid time difference to
  the next message, which can cause the ECS simulator to hang, or
  appear to hang, when loading an event log file with such values.
- SR: B553000162
  After opcagt -stop, opcagt -status reports that the control agent is
  not running although it is, and sometimes you get the following error
  in the message browser:
  'Ouput of kill -0 differs from internal pids-table for index (OpC30-1094)'
PHSS_24917:
- SR: B555010879
  opctrapi aborts during template distribution if conditions with the
  'Suppress Identical Output Messages' feature are used.
- SR: B555010899
  opcdista requests distribution data from the wrong manager if there
  is a secondary manager with the same short hostname as the
  corresponding primary manager.
- SR: B555010948
  Nested alternatives were not handled correctly by the pattern
  matching algorithm; e.g. the pattern '[a|b]c|d' was handled like
  '[a|b|d]c'.
- SR: B555010980
  Traps without an SNMP variable are not matched because the server
  patch adds an extra attribute to the template.
- SR: B555011126
  Agent distribution using the new Secure Shell (SSH) method introduced
  with the A.06.08 server patches does not work for HP-UX agents.
  Nothing is installed, but no error message is reported. The only hint
  is that the "Unpacking truck file /tmp/opc_tmp/opc_pkg.Z" message is
  not displayed during the installation.
PHSS_23988:
- SR: 8606180583
  When the VPO agent was started manually from an MC/SG shared volume,
  the agent was killed upon package stop because it used this volume as
  its current directory. Now the agent always starts in /tmp. A side
  effect is that any core file for the agent is written to /tmp.
- SR: 8606180891
  The template default for the service name is not used.
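The nested-alternatives defect (SR B555010948) can be illustrated with an ordinary ERE engine as a stand-in for VPO's pattern matcher: '[a|b]c|d' should behave like the ERE '^((a|b)c|d)$', whereas the buggy reading '[a|b|d]c' corresponds to '^(a|b|d)c$' and wrongly accepts "dc". A minimal sketch, assuming grep -E as the comparison engine:

```shell
# Stand-in for the corrected VPO semantics of '[a|b]c|d':
# "ac", "bc" and "d" must match; "dc" (the buggy reading) must not.
vpo_like_match() {
    printf '%s\n' "$1" | grep -Eq '^((a|b)c|d)$'
}
```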
- SR: 8606181988
  The event interceptor does not forward on "forward unmatched" if a
  "suppress unmatched" condition is used in a second template.
- SR: 8606182250
  opcfwtmp does not trap bad logins from CDE login.
- SR: 8606182981
  The ITO agent is not started after a system reboot if the default
  runlevel is lower than 3, and no warning is given about that fact.
- SR: B555010341
  The agent sometimes does not start automatically after reboot,
  although a manual start works fine.
PHSS_23821:
- The event correlation process opceca (agent) / opcecm (server) might
  crash after processing several annotation nodes.
- The VPO A.06.03 patches for HP-UX and Solaris do not work as expected
  in firewall environments: server-side port restrictions are still
  respected, but client-side port restrictions are ignored.
PHSS_22881:
- Changes were required for the security add-on product VantagePoint
  Advanced Security.
- The agent installation configure script fails to convert ITO 4 queue
  files: awk syntax error in the swagent.log file.
PHSS_22012:
- disk_mon.sh returns invalid values if the bdf command returns more
  than one line of output for a filesystem (e.g. if the filesystem name
  exceeds its column width).
- Several changes for firewall environments. For detailed information,
  refer to the VPO Firewall Configuration White Paper version 3.0.
- When executing large numbers of automatic actions, some of them
  stayed in 'running' state.
- opctrapi aborts after receiving traps with an unresolvable IP
  address.
- The handling of '\' differed between the pattern definition and the
  "matching pattern".
- If buffer file size limitation is enabled, the agent may discard
  low-severity messages even if there is still space in the buffer
  file.
Defect Description:
PHSS_26726:
- SR: H555006719
  When communication to a message receiver fails, the message agent
  starts buffering messages. It periodically checks whether the server
  is alive by sending it ICMP packets. If the server cannot be reached
  with ICMP packets, no RPC communication is attempted.
  Sending ICMP packets is not possible when the agent is running as a
  non-root user, so the sending function cannot actually send anything.
  Consequently, no replies are ever received and the message agent
  buffers messages forever. To fix this, the internal state of the
  message agent is now updated after an ICMP send attempt if the agent
  is running as a non-root user.
- SR: 8606222554
  The condition test for the message attributes application, object and
  message group is always done case-sensitively; therefore a message
  with the application "TEST" matches but "tEST" does not. This patch
  introduces an opcinfo flag that allows switching between
  case-sensitive and case-insensitive checks:
    flag   : OPC_COND_FIELD_ICASE
    type   : boolean
    default: FALSE
  Setting this to TRUE makes the policies mentioned above work.
- SR: B555012929
  opcdista communicates with the opcctla process via stdin/stdout, so
  when it is run from the command line, you only see the status letters
  without knowing what they mean. The new '-v' option prints more
  output, e.g.:
    $ ./opcdista -v
    0 - No distribution data available.
- SR: B555013435
  One thread tried to read from a socket while another thread closed
  it. This could happen because of missing locking of global data. That
  data is now guarded by a mutex.
- SR: B555013495
  When tracing was added to the API functions, a necessary NLS
  initialisation was omitted. This problem was introduced only by the
  A.06.10 patches for HP-UX.
- SR: B555013699
  Before executing the action, the signal handler function of opcacta
  was not reset.
For SRs not listed in this section, please see the list of symptoms.
PHSS_25539:
- SR: H555006719
  When communication to the message receiver fails, the agent starts
  buffering messages. It periodically checks whether the server is
  alive by sending it ICMP packets. If the server cannot be reached
  with ICMP packets, no RPC communication is attempted.
  This does not work when the agent is running as non-root (only root
  is allowed to send ICMP packets): the sending function returns an OK
  value but sends nothing, so no replies are ever received and the
  message agent never leaves "Checking node" mode.
  Fix: if the agent is running as a non-root user, opcmsga immediately
  tries to contact the management server using RPC communication.
- SR: 8606213476
  While the agent receives several RPC calls, such as "Start
  Distribution", "Execute Action" or "Set Primary Manager", in
  parallel, the calls can conflict within the control agent, which
  causes the NT control agent to bring up a Dr. Watson window. This
  conflict can also occur on UNIX, but there the control agent does not
  die; instead, the RPC request may fail. With this version, the RPC
  calls that could cause conflicts are serialized.
- SR: B555010955
  The non-root user was added to the startup configuration file but not
  used.
- SR: B555010966
  The processing of the key relation is wrong for the logfile
  encapsulator: all unresolved entries followed by a resolved entry are
  removed, while other unresolved entries are kept as they are.
- SR: B555011184
  The working directory for the ITO agent was changed from
  /opt/OV/bin/OpC to /tmp to avoid problems when the agent runs in an
  MC/SG environment.
- SR: B555011638
  VPO could not match the newline characters of multi-line messages.
  The following changes have been made to allow this:
  - It is now possible to use ^M (\r) as a field separator.
  - New patterns are introduced: one matches any number of line breaks
    (UNIX style \n or NT style \r\n), and one matches exactly n line
    breaks; for example, <1/> matches exactly one line break.
  This change works only for sources that can already create multi-line
  messages (for example opcmsg or the NT event log); it does not enable
  multi-line logfile encapsulation. This change requires a fix on both
  the management server and the agent.
  Therefore a patch on the management server and a patch for the agent
  are required to use the new functionality.
For SRs not listed in this section, please see the list of symptoms.
PHSS_24917:
- SR: B555010879
  When freeing the allocated memory, a wrong function was used.
- SR: B555010899
  opcdista requests distribution data from the wrong manager if a
  secondary manager has the same short hostname as the corresponding
  primary manager, because for each name it searches the whole list,
  trying to match first the long and then the short name. Instead, it
  should try the long names of all systems first and only then try to
  match using the short names.
- SR: B555010948
  The grammar was changed to allow nested alternatives and process them
  correctly.
- SR: B555011126
  The SSH agent installation method was not known to the opcrinst
  script, which should unpack the agent on the target node; thus the
  opcrinst script simply did nothing.
For SRs not listed in this section, please see the list of symptoms.
PHSS_23988:
- SR: 8606182250
  opcfwtmp did not handle the LOGIN_PROCESS value of the
  wtmprec.ut_type field of the WTMP structure, so bad logins from CDE
  were not detected.
- SR: 8606182981
  The ITO agent was integrated into the system startup process at
  runlevel 3, but the default runlevel from /etc/inittab was not
  checked. Now there is a check, and you get a warning if the default
  runlevel is lower than 3.
- SR: B555010341
  When the process ID of 'opcctla -start' was the same as that of the
  opcctla running before the shutdown, the internal logic concluded
  that the agent was already running and did not start up the
  subprocesses.
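The runlevel check described above for SR 8606182981 can be sketched as follows; check_runlevel is a hypothetical illustration of the kind of test added to the startup integration, not the actual installation code:

```shell
# Hypothetical sketch: read the initdefault runlevel from an
# inittab-style file and warn when it is lower than 3, since the agent
# startup is integrated at runlevel 3.
check_runlevel() {
    # $1: path to an inittab file; initdefault line looks like "init:3:initdefault:"
    lvl=$(awk -F: '$3 == "initdefault" { print $2 }' "$1")
    if [ -n "$lvl" ] && [ "$lvl" -lt 3 ]; then
        echo "WARNING: default runlevel $lvl is lower than 3"
        return 1
    fi
    return 0
}
```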
for all other defects not listed in this section please see the list of symptoms PHSS_23821: see the list of symptoms PHSS_22881: see the list of symptoms PHSS_22012: see the list of symptoms SR: H555006719 H555006719 B555013699 B555013495 B555013435 B555013371 B555012929 B555011990 B555011979 B555011638 B555011594 B555011505 B555011184 B555011126 B555010980 B555010966 B555010955 B555010948 B555010899 B555010879 B555010620 B555010341 B555010079 B555009745 B555009155 B555009152 B555008838 B555008613 B555008314 B555008220 B555007980 B555007752 B555007709 B555007602 B555007426 B555006890 B555006267 B553000162 8606233602 8606227840 8606222554 8606213476 8606182981 8606182250 8606181988 8606180891 8606180583 8606137088 Patch Files: /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/opc_pkg.Z /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/install/opcrclchk /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/install/opcrdschk /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/install/opcrndchk /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/install/opcroschk /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/install/opcrverchk /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/install/opcrinst /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/monitor/ana_disk.sh.Z /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/monitor/cpu_mon.sh.Z /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/monitor/disk_mon.sh.Z /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/monitor/last_logs.sh.Z /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ 
hp-ux11/A.06.11/RPC_DCE_TCP/monitor/mailq_l.sh.Z /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/monitor/proc_mon.sh.Z /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/monitor/sh_procs.sh.Z /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/monitor/swap_mon.sh.Z /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/monitor/vp_chk.sh.Z /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/monitor/dist_mon.sh.Z /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/monitor/mondbfile.sh.Z /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/monitor/ssp_chk.sh.Z /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/monitor/opcfwtmp.Z /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/monitor/opcnprcs.Z /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/monitor/ opc_get_ems_resource.Z /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/actions/mailq_pr.sh.Z /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/actions/st_inetd.sh.Z /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/actions/st_syslogd.sh.Z /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/actions/st_mail.sh.Z /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/actions/dist_del.sh.Z /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/cmds/opcdf.Z /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/cmds/opclpst.Z /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ 
hp-ux11/A.06.11/RPC_DCE_TCP/cmds/opcps.Z /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/cmds/E10000Log.sh.Z /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/cmds/ssp_config.sh.Z /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/cmds/opc_sec_v.sh.Z /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/AgentPlatform what(1) Output: /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/opc_pkg.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/install/opcrclchk: HP OpenView VantagePoint A.06.11 (04/22/02) /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/install/opcrdschk: HP OpenView VantagePoint A.06.11 (04/22/02) /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/install/opcrndchk: HP OpenView VantagePoint A.06.11 (04/22/02) /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/install/opcroschk: HP OpenView VantagePoint A.06.11 (04/22/02) /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/install/opcrverchk: HP OpenView VantagePoint A.06.11 (04/22/02) /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/install/opcrinst: HP OpenView VantagePoint A.06.11 (04/22/02) /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/monitor/ana_disk.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/monitor/cpu_mon.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/monitor/disk_mon.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/monitor/last_logs.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ 
hp-ux11/A.06.11/RPC_DCE_TCP/monitor/mailq_l.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/monitor/proc_mon.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/monitor/sh_procs.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/monitor/swap_mon.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/monitor/vp_chk.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/monitor/dist_mon.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/monitor/mondbfile.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/monitor/ssp_chk.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/monitor/opcfwtmp.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/monitor/opcnprcs.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/monitor/ opc_get_ems_resource.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/actions/mailq_pr.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/actions/st_inetd.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/actions/st_syslogd.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/actions/st_mail.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/actions/dist_del.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/cmds/opcdf.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ 
hp-ux11/A.06.11/RPC_DCE_TCP/cmds/opclpst.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/cmds/opcps.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/cmds/E10000Log.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/cmds/ssp_config.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/A.06.11/RPC_DCE_TCP/cmds/opc_sec_v.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/ hp-ux11/AgentPlatform: None cksum(1) Output: 3883271667 7537467 /var/opt/OV/share/databases/OpC/mgd_node/ vendor/hp/pa-risc/hp-ux11/A.06.11/RPC_DCE_TCP/ opc_pkg.Z 4075029159 6708 /var/opt/OV/share/databases/OpC/mgd_node/ vendor/hp/pa-risc/hp-ux11/A.06.11/RPC_DCE_TCP/ install/opcrclchk 2480236020 28923 /var/opt/OV/share/databases/OpC/mgd_node/ vendor/hp/pa-risc/hp-ux11/A.06.11/RPC_DCE_TCP/ install/opcrdschk 1881763556 6720 /var/opt/OV/share/databases/OpC/mgd_node/ vendor/hp/pa-risc/hp-ux11/A.06.11/RPC_DCE_TCP/ install/opcrndchk 3681503652 6287 /var/opt/OV/share/databases/OpC/mgd_node/ vendor/hp/pa-risc/hp-ux11/A.06.11/RPC_DCE_TCP/ install/opcroschk 892365892 31983 /var/opt/OV/share/databases/OpC/mgd_node/ vendor/hp/pa-risc/hp-ux11/A.06.11/RPC_DCE_TCP/ install/opcrverchk 3289527428 105394 /var/opt/OV/share/databases/OpC/mgd_node/ vendor/hp/pa-risc/hp-ux11/A.06.11/RPC_DCE_TCP/ install/opcrinst 955494728 2731 /var/opt/OV/share/databases/OpC/mgd_node/ vendor/hp/pa-risc/hp-ux11/A.06.11/RPC_DCE_TCP/ monitor/ana_disk.sh.Z 4152004065 5979 /var/opt/OV/share/databases/OpC/mgd_node/ vendor/hp/pa-risc/hp-ux11/A.06.11/RPC_DCE_TCP/ monitor/cpu_mon.sh.Z 1282106665 6066 /var/opt/OV/share/databases/OpC/mgd_node/ vendor/hp/pa-risc/hp-ux11/A.06.11/RPC_DCE_TCP/ monitor/disk_mon.sh.Z 1212310955 5863 /var/opt/OV/share/databases/OpC/mgd_node/ vendor/hp/pa-risc/hp-ux11/A.06.11/RPC_DCE_TCP/ monitor/last_logs.sh.Z 3760939428 5837 
/var/opt/OV/share/databases/OpC/mgd_node/ vendor/hp/pa-risc/hp-ux11/A.06.11/RPC_DCE_TCP/ monitor/mailq_l.sh.Z 1245222685 6016 /var/opt/OV/share/databases/OpC/mgd_node/ vendor/hp/pa-risc/hp-ux11/A.06.11/RPC_DCE_TCP/ monitor/proc_mon.sh.Z 2937836838 5426 /var/opt/OV/share/databases/OpC/mgd_node/ vendor/hp/pa-risc/hp-ux11/A.06.11/RPC_DCE_TCP/ monitor/sh_procs.sh.Z 1162838784 5896 /var/opt/OV/share/databases/OpC/mgd_node/ vendor/hp/pa-risc/hp-ux11/A.06.11/RPC_DCE_TCP/ monitor/swap_mon.sh.Z 260208334 5740 /var/opt/OV/share/databases/OpC/mgd_node/ vendor/hp/pa-risc/hp-ux11/A.06.11/RPC_DCE_TCP/ monitor/vp_chk.sh.Z 225363609 6115 /var/opt/OV/share/databases/OpC/mgd_node/ vendor/hp/pa-risc/hp-ux11/A.06.11/RPC_DCE_TCP/ monitor/dist_mon.sh.Z 3143496162 14370 /var/opt/OV/share/databases/OpC/mgd_node/ vendor/hp/pa-risc/hp-ux11/A.06.11/RPC_DCE_TCP/ monitor/mondbfile.sh.Z 3744835639 5982 /var/opt/OV/share/databases/OpC/mgd_node/ vendor/hp/pa-risc/hp-ux11/A.06.11/RPC_DCE_TCP/ monitor/ssp_chk.sh.Z 2903239220 14428 /var/opt/OV/share/databases/OpC/mgd_node/ vendor/hp/pa-risc/hp-ux11/A.06.11/RPC_DCE_TCP/ monitor/opcfwtmp.Z 1006568375 10956 /var/opt/OV/share/databases/OpC/mgd_node/ vendor/hp/pa-risc/hp-ux11/A.06.11/RPC_DCE_TCP/ monitor/opcnprcs.Z 290420095 19571 /var/opt/OV/share/databases/OpC/mgd_node/ vendor/hp/pa-risc/hp-ux11/A.06.11/RPC_DCE_TCP/ monitor/opc_get_ems_resource.Z 2925620036 2539 /var/opt/OV/share/databases/OpC/mgd_node/ vendor/hp/pa-risc/hp-ux11/A.06.11/RPC_DCE_TCP/ actions/mailq_pr.sh.Z 1063233944 2582 /var/opt/OV/share/databases/OpC/mgd_node/ vendor/hp/pa-risc/hp-ux11/A.06.11/RPC_DCE_TCP/ actions/st_inetd.sh.Z 4205279545 2591 /var/opt/OV/share/databases/OpC/mgd_node/ vendor/hp/pa-risc/hp-ux11/A.06.11/RPC_DCE_TCP/ actions/st_syslogd.sh.Z 2835770437 2584 /var/opt/OV/share/databases/OpC/mgd_node/ vendor/hp/pa-risc/hp-ux11/A.06.11/RPC_DCE_TCP/ actions/st_mail.sh.Z 725340100 6097 /var/opt/OV/share/databases/OpC/mgd_node/ vendor/hp/pa-risc/hp-ux11/A.06.11/RPC_DCE_TCP/ 
actions/dist_del.sh.Z 535099722 326 /var/opt/OV/share/databases/OpC/mgd_node/ vendor/hp/pa-risc/hp-ux11/A.06.11/RPC_DCE_TCP/cmds/ opcdf.Z 2792251766 388 /var/opt/OV/share/databases/OpC/mgd_node/ vendor/hp/pa-risc/hp-ux11/A.06.11/RPC_DCE_TCP/cmds/ opclpst.Z 1276956937 403 /var/opt/OV/share/databases/OpC/mgd_node/ vendor/hp/pa-risc/hp-ux11/A.06.11/RPC_DCE_TCP/cmds/ opcps.Z 2136432548 3320 /var/opt/OV/share/databases/OpC/mgd_node/ vendor/hp/pa-risc/hp-ux11/A.06.11/RPC_DCE_TCP/cmds/ E10000Log.sh.Z 2996162164 3109 /var/opt/OV/share/databases/OpC/mgd_node/ vendor/hp/pa-risc/hp-ux11/A.06.11/RPC_DCE_TCP/cmds/ ssp_config.sh.Z 1009505556 13184 /var/opt/OV/share/databases/OpC/mgd_node/ vendor/hp/pa-risc/hp-ux11/A.06.11/RPC_DCE_TCP/cmds/ opc_sec_v.sh.Z 1641512547 6324 /var/opt/OV/share/databases/OpC/mgd_node/ vendor/hp/pa-risc/hp-ux11/AgentPlatform
Patch Conflicts: None
Patch Dependencies: None
Hardware Dependencies: None
Other Dependencies: None
Supersedes: PHSS_22012 PHSS_22881 PHSS_23821 PHSS_23988 PHSS_24917 PHSS_25539
Equivalent Patches: PHSS_26727: s700: 11.00 s800: 11.00
Patch Package Size: 7830 KBytes
Installation Instructions:
Please review all instructions and the Hewlett-Packard SupportLine User Guide or your Hewlett-Packard support terms and conditions for precautions, scope of license, restrictions, and limitation of liability and warranties, before installing this patch.
------------------------------------------------------------
1. Back up your system before installing a patch.
2. Log in as root.
3. Copy the patch to the /tmp directory.
4. Move to the /tmp directory and unshar the patch:
     cd /tmp
     sh PHSS_26726
5a. For a standalone system, run swinstall to install the patch:
     swinstall -x autoreboot=true -x match_target=true \
         -s /tmp/PHSS_26726.depot
By default swinstall will archive the original software in /var/adm/sw/patch/PHSS_26726.
If you do not wish to retain a copy of the original software, you can create an empty file named /var/adm/sw/patch/PATCH_NOSAVE.
WARNING: If this file exists when a patch is installed, the patch cannot be deinstalled. Please be careful when using this feature.
It is recommended that you move the PHSS_26726.text file to /var/adm/sw/patch for future reference.
To put this patch on a magnetic tape and install from the tape drive, use the command:
     dd if=/tmp/PHSS_26726.depot of=/dev/rmt/0m bs=2k
Special Installation Instructions:
BEFORE LOADING THIS PATCH...
o It provides bug fixes and enhancements for the VPO A.06.00 Management Server system.
o DO NOT use this patch with older releases of ITO, for example versions A.05.00, A.05.11 or A.05.30.
(A) Patch Installation Instructions
    -------------------------------
(A1) Install the patch, following the standard installation instructions. For backing up the system before installing a patch, you may use opc_backup(1m).
NOTE: MAKE SURE THAT NO AGENT OF THE PLATFORM ADDRESSED BY THIS PATCH IS DISTRIBUTED (either from the VPO Administrator's GUI or from the command line using inst.sh) WHILE RUNNING SWINSTALL.
Don't be afraid of the '-x autoreboot=true' option above; there will not be a reboot due to this VPO patch. You can skip this option if you like.
If you are running VPO in an MC/ServiceGuard installation:
- Note that only files on the shared disk volume at /var/opt/OV/share will be patched. Therefore, install the patch on one cluster node while the shared disks are mounted. The server processes may be running during patch installation.
- It is not necessary to install this patch on all cluster nodes. Even though the software inventory on the other cluster nodes will not be updated, the patched files will be available there when the shared disk is switched to them.
NOTE: This patch must be installed on the VPO Management Server system, NOT directly on a VPO Managed Node.
Changes will take effect on managed nodes by means of VPO Software Distribution (using 'Force Update' if an agent is already installed on the managed node). See chapter 2 of the VPO Administrator's Reference manual for more information.
(B) Patch Deinstallation Instructions
    ---------------------------------
(B1) To deinstall the patch PHSS_26726, run swremove:
NOTE: MAKE SURE THAT NO AGENT OF THE PLATFORM ADDRESSED BY THIS PATCH IS DISTRIBUTED (either from the ITO Administrator's GUI or from the command line using inst.sh) WHILE RUNNING SWREMOVE.
If you are running VPO in an MC/ServiceGuard installation, make sure to mount the shared disks at the node, and only at the node, that had them mounted during patch installation. Otherwise restoration of the original files onto the shared disk will fail.
     # swremove PHSS_26726