Patch Name: PHSS_25532
Patch Description: s700_800 10.20 OV ITO5.3X HP-UX 11.00/11.11 Agent A.05.57
Creation Date: 02/08/21
Post Date: 02/09/17
Hardware Platforms - OS Releases:
  s700: 10.20
  s800: 10.20
Products: OpenView IT/Operations 5.3
Filesets: OVOPC-CLT.OVOPC-UX11-CLT,A.05.30
Automatic Reboot?: No
Status: General Release
Critical: No
Path Name: /hp-ux_patches/s700_800/10.X/PHSS_25532

Symptoms:
PHSS_25532:
 - SR: H555006719
   If the agent is running as a non-root user and the management
   server processes are restarted, the agent must also be restarted;
   otherwise messages are buffered.
 - SR: 8606180891
   The template default for the service name is not used.
 - SR: 8606181988
   The trap interceptor does not forward on "forward unmatched" if a
   "suppress unmatched" condition is used in a second template.
 - SR: 8606182250
   The opcfwtmp tool does not trap bad logins from CDE login.
 - SR: 8606182275
   A new opcinfo variable can be used to stop the message agent from
   sending an ICMP packet before connecting to the message receiver.
 - SR: B555007980
   Local automatic actions are started immediately, even though the
   agent MSI is enabled in divert mode and the Immediate Local
   Automatic Action box is not checked.
 - SR: B555009155
   The event correlation process opceca (agent) / opcecm (server)
   might crash after processing several annotation nodes.
 - SR: B555009745
   The template default of the object field of a monitor template is
   not used.
 - SR: B555010341
   The agent may not start automatically after a reboot, although
   starting the agent manually works without a problem.
 - SR: B555010620
   Some messages are missing in the Japanese message catalogue,
   resulting in a "Cannot generate message" error.
 - SR: B555010948
   Nested alternatives were not handled correctly by the pattern
   matching algorithm; e.g. the pattern "[a|b]c|d" was handled like
   "[a|b|d]c".
 - SR: B555010980
   Traps without an SNMP variable are not matched because the server
   patch adds an extra attribute to the template.
 - SR: B555011505
   1. opcecm/opceca might run into a deadlock while processing many
      ECS annotate nodes.
   2. opcecm/opceca might leak memory when ECS annotate nodes are
      used.
 - SR: B555011594
   The original message text of a logfile encapsulator message is
   wrong if <$LOGPATH> or <$LOGFILE> is used.
 - SR: B555011638
   The pattern matching cannot match the newline character(s) of
   multiline messages.
 - SR: B555011979
   The pattern matching hangs if only single-byte Japanese HANKAKU
   KANA characters are used.
 - SR: B555011990
   The ECS event log (ecevilg) has an invalid time difference to the
   next message, which can cause the ECS simulator to hang (or appear
   to hang) when loading an event log file with such values.
 - SR: B555013435
   The message agent opcmsga hangs unpredictably. This is more likely
   to happen on systems with very high ICMP traffic.

PHSS_22738:
 - The ITO variable $MSG_ID is not properly passed by the ITO Message
   Text Interface.
 - The command "opcagt -kill" sometimes does not kill all agent
   processes; often only opcctla is killed.
 - The ITO Agent API call opcagtmsg_send() might leak memory if
   opcmsgi is not running.
 - The ITO Agent API call opcagtmsg_send() might leak memory.
 - opctrapi might abort with OpC30-104.
 - The ITO Agent might hang during ITA synchronisation when
   transferring bigger files (>0.5 MB) from several nodes to the ITO
   server at the same time.
 - In case of an error the ITO Agent might hang during startup
   without terminating.
 - opctrapi might abort after getting traps with an unresolvable IP
   address.
 - Use the ECS runtime of ECS patches PHSS_22047, PHSS_22048 or
   PSOV_02769 to get the fixes of those patches.
 - The event correlation engine creates a "Time cannot go backwards"
   error if the system is very busy.
 - When processes received synchronous signals (such as SIGABRT,
   SIGSEGV, SIGBUS), they sometimes looped at 100% CPU instead of
   writing a core file.

PHSS_22006:
 - The opcfwtmp tool does not always report bad logins.
 - Setting up a logfile template that uses a dynamic path definition,
   like:
     <`echo "/tmp/path with spaces/logfile1.txt"`>
   causes the path to be handled as three logfiles:
     /tmp/path
     with
     spaces/logfile1.txt
 - RPC template distribution fails if too many (> ~70) templates are
   distributed.
 - In NAT environments the distribution does not work when an IP
   address is set for the system that does not physically exist on
   any of its network cards.

PHSS_21068:
 - Agent installation on HP-UX 11.10 systems failed due to dependency
   errors.
 - Filtering ITO internal messages (OPC_INT_MSG_FLT TRUE in the
   opcinfo file) causes agent aborts if tracing is switched on.
 - The installation check for running DCE processes matches all
   processes ending with "rpcd".
 - The swap_util monitor reports more swap space than is actually
   available because it runs "swapinfo -dft" instead of the more
   appropriate "swapinfo -dftr".
 - Agent installation has problems if the opcinfo file happens to be
   empty by accident.

PHSS_20051:
 - Pattern matching does not work correctly on a Japanese system.
 - The trap interceptor opctrapi does not correctly handle the
   reconfig signal.
 - The trap interceptor opctrapi tries name resolution on segment
   names.
 - The ITO-SE Java GUI hangs if the Enterprise Edition agent is
   installed over ITO-SE.
 - The trap interceptor opctrapi dies on some large traps.

Defect Description:
PHSS_25532:
 - SR: H555006719
   When communication with a message receiver fails, the message
   agent starts buffering messages. It periodically checks whether a
   server is alive by sending it ICMP packets; if the server cannot
   be reached with ICMP packets, no RPC communication is attempted.
   Sending ICMP packets is not possible when the agent is running as
   a non-root user, so the sending function cannot actually send
   anything. Therefore no replies are ever received and the message
   agent buffers messages forever.
   To fix this, the internal state of the message agent is now
   updated after an ICMP packet send is attempted while the agent is
   running as a non-root user.
 - SR: 8606182250
   The opcfwtmp tool did not handle the LOGIN_PROCESS value of the
   wtmprec.ut_type field of the WTMP structure, so bad logins from
   CDE were not detected.
 - SR: 8606182275
   To avoid any communication other than RPC, the opcinfo variable
   OPC_RPC_ONLY can be used. It is a boolean value whose default is
   FALSE.
 - SR: B555010341
   When the process ID of "opcctla -start" was the same as that of
   the opcctla running before the shutdown, the internal logic
   concluded that the agent was already running and did not start the
   subprocesses.
 - SR: B555010948
   The grammar was changed to allow nested alternatives and process
   them correctly.
 - SR: B555011638
   The pattern matching could not match the newline character(s) of
   multiline messages. The following changes have been made to allow
   this:
   - It is now possible to use ^M (\r) as a field separator.
   - A new pattern, <n/>, was introduced to match line breaks (UNIX
     style \n or NT style \r\n); it matches exactly n line breaks,
     for example <1/> matches exactly one line break.
   This change works only for sources that can already create
   multiline messages (for example opcmsg or the NT event log); it
   does not enable multiline logfile encapsulation. The change
   affects both the management server and the agent, so a patch for
   each is required to use the new functionality.
 - SR: B555013435
   One thread tried to read from a socket while another thread closed
   it. This could happen because of missing locking of global data;
   this data is now guarded by a mutex.
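The mutex fix for SR B555013435 follows a standard pattern: every access to the shared connection state takes the same lock, so a close can no longer race a concurrent read. Below is a minimal illustrative sketch of that idea; it is not VPO source code (the agent is written in C, and all names here are hypothetical):

```python
import threading

class GuardedConnection:
    """Shared connection state guarded by a single mutex, so one
    thread cannot close the descriptor while another is using it."""

    def __init__(self, fd):
        self._fd = fd
        self._lock = threading.Lock()

    def use(self):
        # Take the lock before touching the shared descriptor.
        with self._lock:
            if self._fd is None:
                return None          # already closed: fail cleanly
            return self._fd          # a real agent would read() here

    def close(self):
        # Same lock, so close() cannot interleave with use().
        with self._lock:
            self._fd = None

conn = GuardedConnection(fd=7)
t = threading.Thread(target=conn.close)
t.start()
t.join()
print(conn.use())   # None: the descriptor was closed safely
```

Without the shared lock, `close()` could run between the `None` check and the read, which is exactly the unpredictable hang described above.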
   For SRs not listed in this section, please see the list of
   symptoms.

PHSS_22738:
   See the list of symptoms.

PHSS_22006:
   See the list of symptoms.

PHSS_21068:
 - Agent installation on HP-UX 11.10 systems failed due to dependency
   errors.
   Resolution: removed the static dependency on the DCE-KT-Tools
   product and added a check that is performed only if the agent is
   installed on HP-UX 11.00.
 - Filtering ITO internal messages (OPC_INT_MSG_FLT TRUE in the
   opcinfo file) causes agent aborts if tracing is switched on.
   Resolution: changed the memory handling in the code.

PHSS_20051:
 - Pattern matching does not work correctly on a Japanese system.
 - The trap interceptor opctrapi does not correctly handle the
   reconfig signal.
 - The trap interceptor opctrapi tries name resolution on segment
   names.
 - The ITO-SE Java GUI hangs if the Enterprise Edition agent is
   installed over ITO-SE.
 - The trap interceptor opctrapi dies on some large traps.
 Resolution:
 - Fixed a bug in the multi-byte character set handling.
 - Check for reconfiguration each time a trap comes in.
 - No name resolution is performed on segment names; they are handled
   as if the name service had returned "not resolvable".
 - Increased the buffer size for trap formatting.
 - The ITO-SE functionality in the Enterprise Edition agent was
   fixed.
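The nested-alternative defect (SR B555010948, listed under Symptoms above) can be illustrated by translating the ITO pattern syntax into ordinary regular expressions. This is only an illustrative sketch, not ITO code; it assumes just the behavior described above, namely that "[x|y]" groups alternatives and a top-level "|" separates whole-pattern alternatives:

```python
import re

# Correct reading of the ITO pattern "[a|b]c|d":
# ("a" or "b") followed by "c", OR just "d".
correct = re.compile(r"(?:a|b)c|d")

# The pre-patch agent flattened the nesting, behaving as if the
# pattern were "[a|b|d]c": ("a", "b" or "d") followed by "c".
buggy = re.compile(r"(?:a|b|d)c")

for text in ("ac", "bc", "d", "dc"):
    print(text, bool(correct.fullmatch(text)), bool(buggy.fullmatch(text)))
# Pre-patch, "d" failed to match even though it should, and "dc"
# matched even though it should not.
```

The grammar change restores the grouping semantics shown by `correct`.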
SR: R555004827 R555004825 R555004824 R555001968 H555006719
    H555003505 H555003466 H555002653 H555002616 H555002499
    H555002486 B555013435 B555011990 B555011979 B555011638
    B555011594 B555011505 B555010980 B555010948 B555010620
    B555010341 B555009745 B555009155 B555008923 B555008838
    B555008660 B555008314 B555007980 B555007898 B555007439
    B555007233 B555007187 B555006381 B555006098 B555006078
    B555004841 8606182275 8606182250 8606181988 8606180891
    8606153194 8606141434 8606113695

Patch Files:
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/opc_pkg.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/install/opcrclchk
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/install/opcrdschk
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/install/opcrndchk
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/install/opcroschk
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/install/opcrverchk
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/install/opcrinst
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/monitor/ana_disk.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/monitor/cpu_mon.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/monitor/disk_mon.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/monitor/last_logs.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/monitor/mailq_l.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/monitor/perf_alxp.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/monitor/proc_mon.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/monitor/sh_procs.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/monitor/swap_mon.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/monitor/vp_chk.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/monitor/mwa_read.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/monitor/dist_mon.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/monitor/mondbfile.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/monitor/opcfwtmp.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/monitor/opcnprcs.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/monitor/opc_get_ems_resource.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/actions/mailq_pr.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/actions/st_inetd.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/actions/st_mail.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/actions/pv.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/actions/dist_del.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/actions/itogpm.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/cmds/opcdf.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/cmds/opclpst.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/cmds/opcps.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/cmds/opc_sec_v.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/AgentPlatform

what(1) Output:
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/opc_pkg.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/install/opcrclchk:
      HP OpenView IT/Operations A.05.57 (03/18/02)
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/install/opcrdschk:
      HP OpenView IT/Operations A.05.57 (03/18/02)
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/install/opcrndchk:
      HP OpenView IT/Operations A.05.57 (03/18/02)
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/install/opcroschk:
      HP OpenView IT/Operations A.05.57 (03/18/02)
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/install/opcrverchk:
      HP OpenView IT/Operations A.05.57 (03/18/02)
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/install/opcrinst:
      HP OpenView IT/Operations A.05.57 (03/18/02)
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/monitor/ana_disk.sh.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/monitor/cpu_mon.sh.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/monitor/disk_mon.sh.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/monitor/last_logs.sh.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/monitor/mailq_l.sh.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/monitor/perf_alxp.sh.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/monitor/proc_mon.sh.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/monitor/sh_procs.sh.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/monitor/swap_mon.sh.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/monitor/vp_chk.sh.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/monitor/mwa_read.sh.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/monitor/dist_mon.sh.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/monitor/mondbfile.sh.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/monitor/opcfwtmp.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/monitor/opcnprcs.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/monitor/opc_get_ems_resource.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/actions/mailq_pr.sh.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/actions/st_inetd.sh.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/actions/st_mail.sh.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/actions/pv.sh.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/actions/dist_del.sh.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/actions/itogpm.sh.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/cmds/opcdf.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/cmds/opclpst.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/cmds/opcps.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/A.05.57/RPC_DCE_TCP/cmds/opc_sec_v.sh.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
      hp-ux11/AgentPlatform: None

cksum(1) Output:
  604350587 7073789 /var/opt/OV/share/databases/OpC/mgd_node/
      vendor/hp/pa-risc/hp-ux11/A.05.57/RPC_DCE_TCP/opc_pkg.Z
  1167495638 6719 /var/opt/OV/share/databases/OpC/mgd_node/
      vendor/hp/pa-risc/hp-ux11/A.05.57/RPC_DCE_TCP/install/opcrclchk
  1804791963 28928 /var/opt/OV/share/databases/OpC/mgd_node/
      vendor/hp/pa-risc/hp-ux11/A.05.57/RPC_DCE_TCP/install/opcrdschk
  2256119237 6667 /var/opt/OV/share/databases/OpC/mgd_node/
      vendor/hp/pa-risc/hp-ux11/A.05.57/RPC_DCE_TCP/install/opcrndchk
  3277914771 6300 /var/opt/OV/share/databases/OpC/mgd_node/
      vendor/hp/pa-risc/hp-ux11/A.05.57/RPC_DCE_TCP/install/opcroschk
  265811717 32053 /var/opt/OV/share/databases/OpC/mgd_node/
      vendor/hp/pa-risc/hp-ux11/A.05.57/RPC_DCE_TCP/install/opcrverchk
  2140871485 109594 /var/opt/OV/share/databases/OpC/mgd_node/
      vendor/hp/pa-risc/hp-ux11/A.05.57/RPC_DCE_TCP/install/opcrinst
  3381589162 2569 /var/opt/OV/share/databases/OpC/mgd_node/
      vendor/hp/pa-risc/hp-ux11/A.05.57/RPC_DCE_TCP/monitor/ana_disk.sh.Z
  2012917748 5953 /var/opt/OV/share/databases/OpC/mgd_node/
      vendor/hp/pa-risc/hp-ux11/A.05.57/RPC_DCE_TCP/monitor/cpu_mon.sh.Z
  3365220110 5962 /var/opt/OV/share/databases/OpC/mgd_node/
      vendor/hp/pa-risc/hp-ux11/A.05.57/RPC_DCE_TCP/monitor/disk_mon.sh.Z
  2038898838 5834 /var/opt/OV/share/databases/OpC/mgd_node/
      vendor/hp/pa-risc/hp-ux11/A.05.57/RPC_DCE_TCP/monitor/last_logs.sh.Z
  1657419677 5813 /var/opt/OV/share/databases/OpC/mgd_node/
      vendor/hp/pa-risc/hp-ux11/A.05.57/RPC_DCE_TCP/monitor/mailq_l.sh.Z
  1734906432 3166 /var/opt/OV/share/databases/OpC/mgd_node/
      vendor/hp/pa-risc/hp-ux11/A.05.57/RPC_DCE_TCP/monitor/perf_alxp.sh.Z
  1270791950 5975 /var/opt/OV/share/databases/OpC/mgd_node/
      vendor/hp/pa-risc/hp-ux11/A.05.57/RPC_DCE_TCP/monitor/proc_mon.sh.Z
  880646978 5415 /var/opt/OV/share/databases/OpC/mgd_node/
      vendor/hp/pa-risc/hp-ux11/A.05.57/RPC_DCE_TCP/monitor/sh_procs.sh.Z
  1324883417 5896 /var/opt/OV/share/databases/OpC/mgd_node/
      vendor/hp/pa-risc/hp-ux11/A.05.57/RPC_DCE_TCP/monitor/swap_mon.sh.Z
  1232149950 5663 /var/opt/OV/share/databases/OpC/mgd_node/
      vendor/hp/pa-risc/hp-ux11/A.05.57/RPC_DCE_TCP/monitor/vp_chk.sh.Z
  1141243706 5844 /var/opt/OV/share/databases/OpC/mgd_node/
      vendor/hp/pa-risc/hp-ux11/A.05.57/RPC_DCE_TCP/monitor/mwa_read.sh.Z
  1486906764 6105 /var/opt/OV/share/databases/OpC/mgd_node/
      vendor/hp/pa-risc/hp-ux11/A.05.57/RPC_DCE_TCP/monitor/dist_mon.sh.Z
  3385972961 14089 /var/opt/OV/share/databases/OpC/mgd_node/
      vendor/hp/pa-risc/hp-ux11/A.05.57/RPC_DCE_TCP/monitor/mondbfile.sh.Z
  4193196958 12204 /var/opt/OV/share/databases/OpC/mgd_node/
      vendor/hp/pa-risc/hp-ux11/A.05.57/RPC_DCE_TCP/monitor/opcfwtmp.Z
  97173376 10491 /var/opt/OV/share/databases/OpC/mgd_node/
      vendor/hp/pa-risc/hp-ux11/A.05.57/RPC_DCE_TCP/monitor/opcnprcs.Z
  1752092164 15399 /var/opt/OV/share/databases/OpC/mgd_node/
      vendor/hp/pa-risc/hp-ux11/A.05.57/RPC_DCE_TCP/monitor/opc_get_ems_resource.Z
  1808082801 2530 /var/opt/OV/share/databases/OpC/mgd_node/
      vendor/hp/pa-risc/hp-ux11/A.05.57/RPC_DCE_TCP/actions/mailq_pr.sh.Z
  1628473190 2570 /var/opt/OV/share/databases/OpC/mgd_node/
      vendor/hp/pa-risc/hp-ux11/A.05.57/RPC_DCE_TCP/actions/st_inetd.sh.Z
  4067365058 2584 /var/opt/OV/share/databases/OpC/mgd_node/
      vendor/hp/pa-risc/hp-ux11/A.05.57/RPC_DCE_TCP/actions/st_mail.sh.Z
  1269993954 5767 /var/opt/OV/share/databases/OpC/mgd_node/
      vendor/hp/pa-risc/hp-ux11/A.05.57/RPC_DCE_TCP/actions/pv.sh.Z
  427323140 6096 /var/opt/OV/share/databases/OpC/mgd_node/
      vendor/hp/pa-risc/hp-ux11/A.05.57/RPC_DCE_TCP/actions/dist_del.sh.Z
  1407403240 1376 /var/opt/OV/share/databases/OpC/mgd_node/
      vendor/hp/pa-risc/hp-ux11/A.05.57/RPC_DCE_TCP/actions/itogpm.sh.Z
  2594235347 324 /var/opt/OV/share/databases/OpC/mgd_node/
      vendor/hp/pa-risc/hp-ux11/A.05.57/RPC_DCE_TCP/cmds/opcdf.Z
  2521768616 386 /var/opt/OV/share/databases/OpC/mgd_node/
      vendor/hp/pa-risc/hp-ux11/A.05.57/RPC_DCE_TCP/cmds/opclpst.Z
  2630350664 400 /var/opt/OV/share/databases/OpC/mgd_node/
      vendor/hp/pa-risc/hp-ux11/A.05.57/RPC_DCE_TCP/cmds/opcps.Z
  198648288 13214 /var/opt/OV/share/databases/OpC/mgd_node/
      vendor/hp/pa-risc/hp-ux11/A.05.57/RPC_DCE_TCP/cmds/opc_sec_v.sh.Z
  1409389308 6305 /var/opt/OV/share/databases/OpC/mgd_node/
      vendor/hp/pa-risc/hp-ux11/AgentPlatform

Patch Conflicts: None
Patch Dependencies: None
Hardware Dependencies: None
Other Dependencies: None
Supersedes: PHSS_20051 PHSS_21068 PHSS_22006 PHSS_22738
Equivalent Patches:
  ITOSOL_00125: sparcSOL: 2.6 2.7
  PHSS_25533: s700: 11.00 s800: 11.00
Patch Package Size: 7380 KBytes

Installation Instructions:
Please review all instructions and the Hewlett-Packard SupportLine
User Guide or your Hewlett-Packard support terms and conditions for
precautions, scope of license, restrictions, and limitation of
liability and warranties, before installing this patch.
------------------------------------------------------------
1. Back up your system before installing a patch.
2. Log in as root.
3. Copy the patch to the /tmp directory.
4. Move to the /tmp directory and unshar the patch:
     cd /tmp
     sh PHSS_25532
5a. For a standalone system, run swinstall to install the patch:
     swinstall -x autoreboot=true -x match_target=true \
         -s /tmp/PHSS_25532.depot

By default swinstall will archive the original software in
/var/adm/sw/patch/PHSS_25532. If you do not wish to retain a copy of
the original software, you can create an empty file named
/var/adm/sw/patch/PATCH_NOSAVE.

WARNING: If this file exists when a patch is installed, the patch
cannot be deinstalled. Please be careful when using this feature.
It is recommended that you move the PHSS_25532.text file to
/var/adm/sw/patch for future reference.

To put this patch on a magnetic tape and install from the tape
drive, use the command:
     dd if=/tmp/PHSS_25532.depot of=/dev/rmt/0m bs=2k

Special Installation Instructions:
BEFORE LOADING THIS PATCH...
 o You may use this patch with the following official ITO release:
   A.05.30
 o DO NOT use this patch with older releases of ITO, for example
   versions A.05.00 or A.05.11.
 o This patch provides the HP OV IT/Operations Intelligent Agent to
   be used with an ITO A.05.30 Management Server system.

(A) Patch Installation Instructions
    -------------------------------
(A1) Install the patch, following the standard installation
     instructions. To back up the system before installing a patch,
     you may use opc_backup(1m).

     NOTE: MAKE SURE THAT NO AGENT OF THE PLATFORM ADDRESSED BY THIS
     PATCH IS DISTRIBUTED (either from the VPO Administrator's GUI or
     from the command line using inst.sh) WHILE RUNNING SWINSTALL.

     Do not be concerned about the '-x autoreboot=true' option above:
     this VPO patch will not cause a reboot. You can skip this option
     if you like.

     If you are running VPO in an MC/ServiceGuard installation:
     - Note that only files on the shared disk volume at
       /var/opt/OV/share will be patched. Therefore install the patch
       on one cluster node while the shared disks are mounted. The
       server processes may be running during patch installation.
     - It is not necessary to install this patch on all cluster
       nodes. Even though the software inventory on the other cluster
       nodes will not be updated, the patched files will be available
       there when the shared disk is switched to them.

     NOTE: This patch must be installed on the ITO Management Server
     system, NOT directly on an ITO Managed Node. Changes take effect
     on managed nodes by means of ITO Software Distribution (using
     'Force Update' if an agent is already installed on the managed
     node).
     See chapter 2 of the ITO Administrator's Reference manual for
     more information.

(B) Patch Deinstallation Instructions
    ---------------------------------
(B1) To deinstall the patch PHSS_25532, run swremove:

     NOTE: MAKE SURE THAT NO AGENT OF THE PLATFORM ADDRESSED BY THIS
     PATCH IS DISTRIBUTED (either from the ITO Administrator's GUI or
     from the command line using inst.sh) WHILE RUNNING SWREMOVE.

     If you are running VPO in an MC/ServiceGuard installation, make
     sure to mount the shared disks at the node (and only at the
     node) that had them mounted during patch installation. Otherwise
     restoration of the original files onto the shared disk will
     fail.

     # swremove PHSS_25532