Pre-Installation Setup > Adding/Editing Data Collectors
  
Version 9.2.00
Adding/Editing Data Collectors
 
To enable the Data Collector server to pass data to the Portal server, a corresponding Data Collector must be created in the Portal, along with Data Collector Policies for each of the vendor-specific enterprise objects. Data Collector Policies are specific to the type of data that is being collected; however, multiple types of policies can often be combined within one Data Collector.
The first step is to create a Data Collector. Once created, you can add enterprise object policies to it. Often one Portal Data Collector is sufficient for adding Data Collector policies for a variety of enterprise objects such as backup servers, arrays, and switches. Only Backup Manager TSM collection requires its own Data Collector.
For information about upgrading data collectors, refer to the Installation and Upgrade Guides for Windows or Linux.
To Add a Portal Data Collector
1. Select Admin > Data Collection > Collectors.
The list of currently configured Portal Data Collectors is displayed. If a Data Collector has already been created, rather than creating a new Data Collector, you may want to add your collection policies to an existing Data Collector.
2. Click Add and select Collector to create a new Data Collector. Refer to the details listed in To Edit a Portal Data Collector.
To Edit a Portal Data Collector
1. Select Admin > Data Collection > Collectors.
The list of currently configured Portal Data Collectors is displayed.
2. Select a Data Collector from the list.
3. Click Edit.
4. Enter or modify fields as necessary.
Field
Description
Sample Values
Collector Name*
Click Change to edit the unique name assigned to this Data Collector. The Data Collector will use this value for authentication purposes.
Changing the Collector ID or passcode requires manual changes to the corresponding Data Collector server. Collection will break if these corresponding changes are not made. See User ID and Passcode on the Data Collector Server.
BUEdc1
TSMdc1
HRdc1
Passcode*
Click Change to edit the passcode assigned to this Data Collector. It can be any character sequence.
Unlike other StorageConsole passwords (which are encrypted and then saved), this Data Collector passcode is not encrypted prior to saving in the StorageConsole database and may appear as clear text in certain files. It is simply intended as a “handshake” identification between the Data Collector and the policy.
Changing the Collector ID or passcode requires manual changes to the corresponding Data Collector server. Collection will break if these corresponding changes are not made. See User ID and Passcode on the Data Collector Server.
Password1
Short Notes
Descriptive notes associated with this Data Collector.
 
Enable SSL
Both secure (SSL) and non-secure Data Collectors can send data to the same Portal. Check this box to select the secure communication protocol (https) that the Data Collector will use.
This check box will not appear in the dialog box if SSL is not enabled in your environment. The Portal data receiver must be listening for https traffic; for example:
https://agent.mycollector.com
 
Auto-upgrade aptare.jar
Indicate if you want the aptare.jar file upgraded automatically.
This part of the Data Collector is responsible for event and metadata processing threads. The .jar file contains the processing and parsing logic for data collection. The latest versions can be downloaded automatically and applied to the collector during upgrades. It is recommended that this setting be set to Yes.
Yes
Auto-upgrade Upgrade Manager
Indicate if you want the Upgrade Manager bundle upgraded automatically.
This data collector component is responsible for managing Data Collector upgrades. The latest versions can be downloaded automatically and applied to the collector during upgrades. It is recommended that this setting be set to Yes.
Yes
User ID and Passcode on the Data Collector Server
Warning: Any Portal changes to the Data Collector Policy User ID and Passcode require manual modifications on the Data Collector server. Corresponding changes must be made to files on the Data Collector server so that it can continue to communicate with the Portal.
To change the User ID and Passcode on the Data Collector Server
1. Find and update the User ID and Passcode in each of the OS-specific files listed below. These entries are typically in the last line of a long string of configuration settings.
Windows:
\<HOME>\mbs\bin\updateconfig.bat
\<HOME>\mbs\conf\wrapper.conf
Example:
\opt\aptare\mbs\bin\updateconfig.bat
\opt\aptare\mbs\conf\wrapper.conf
Linux:
/<HOME>/mbs/bin/updateconfig.sh
/<HOME>/mbs/conf/startup.sh
Example:
/opt/aptare/mbs/bin/updateconfig.sh
/opt/aptare/mbs/conf/startup.sh
Note: Restart the Data Collector to trigger the updates.
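The manual edit above amounts to a find-and-replace of the old User ID and Passcode across the listed files. As an illustrative sketch only (not part of StorageConsole, and the key layout inside these files varies by release), a small helper could perform the same substitution:

```python
from pathlib import Path

def update_collector_credentials(path, old_id, new_id, old_passcode, new_passcode):
    """Swap the collector User ID and Passcode in one configuration file.

    The exact key names inside updateconfig.bat / wrapper.conf / startup.sh
    vary by release, so this performs a plain string substitution of the old
    values -- the same effect as the manual edit described above.
    """
    text = Path(path).read_text()
    updated = text.replace(old_id, new_id).replace(old_passcode, new_passcode)
    Path(path).write_text(updated)
    return updated

# Example (Linux paths from the example above; run on the Data Collector server):
# for f in ("/opt/aptare/mbs/bin/updateconfig.sh", "/opt/aptare/mbs/conf/startup.sh"):
#     update_collector_credentials(f, "BUEdc1", "BUEdc2", "Password1", "NewPasscode")
```

As the note above states, restart the Data Collector afterwards so the updates take effect.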
 
 
Adding a File Analytics Shares Data Collector Policy
One of the three types of data collection that can be configured for File Analytics is collection of CIFS shares. The Data Collector will take the configuration that you specify, including the share names and credentials, and then traverse the filesystem structure to identify these shared resources on your network and collect the relevant metadata.
1. Click Add and select File Analytics Share.
2. Enter or select the parameters. Mandatory parameters are denoted by an asterisk (*):
Field
Description
Sample Value
Collector Domain
The domain of the collector to which this policy is being added. This is a read-only field. By default, the domain for a new policy will be the same as the domain for the collector. This field is set when you add a collector.
 
Policy Domain
The Collector Domain is the domain that was supplied during the Data Collector installation process. The Policy Domain is the domain of the policy that is being configured for the Data Collector. The Policy Domain must be set to the same value as the Collector Domain. The domain identifies the top level of your host group hierarchy. All newly discovered hosts are added to the root host group associated with the Policy Domain.
Typically, only one Policy Domain will be available in the drop-down list. If you are a Managed Services Provider, each of your customers will have a unique domain with its own host group hierarchy.
To find your Domain name, select Admin > Hosts and Domains > Domains.
yourdomain
Name*
Enter a name that will be displayed in the list of Data Collector policies.
WindowsFiles
Schedule*
Click the clock icon to create a schedule. Every Minute, Hourly, Daily, Weekly, and Monthly schedules may be created. Relative schedules are relative to when the Data Collector is restarted. Advanced use of native CRON strings is also available.
Examples of CRON expressions:
*/30 * * * * means every 30 minutes
*/20 9-18 * * * means every 20 minutes between the hours of 9am and 6pm
*/10 * * * 1-5 means every 10 minutes Mon - Fri.
 
Shares*
Click Add to configure the CIFS shares that the collector will probe.
Note that the Import button in this window enables bulk loading of CIFS shares. See Importing the CIFS Share Configuration.
 
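The CRON expressions shown for the Schedule field can be sanity-checked with a small matcher. This is an illustrative sketch, not StorageConsole code; it handles only the `*`, `*/n`, and `a-b` forms used in the examples above:

```python
def field_matches(spec, value):
    """Match one CRON field: '*', '*/n', 'a-b', or a plain number."""
    if spec == "*":
        return True
    if spec.startswith("*/"):
        return value % int(spec[2:]) == 0
    if "-" in spec:
        lo, hi = map(int, spec.split("-"))
        return lo <= value <= hi
    return value == int(spec)

def cron_matches(expr, minute, hour, day, month, weekday):
    """expr is 'minute hour day month weekday'; weekday 0=Sun .. 6=Sat."""
    values = (minute, hour, day, month, weekday)
    return all(field_matches(f, v) for f, v in zip(expr.split(), values))

# '*/20 9-18 * * *' fires every 20 minutes between 9am and 6pm:
assert cron_matches("*/20 9-18 * * *", minute=20, hour=10, day=1, month=1, weekday=3)
assert not cron_matches("*/20 9-18 * * *", minute=20, hour=8, day=1, month=1, weekday=3)
```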
3. Enter or select CIFS shares configuration parameters in the File Analytics Shares window.
Field
Description
Sample Value
Host/Device*
Enter the host IP address or host name for the device that is being probed for CIFS shares. This could also be a non-host device, such as a NetApp array.
172.1.1.1
Share*
Enter the name of the CIFS share that the Data Collector will probe.
HOME
Protocol*
CIFS is currently the only option.
 
Authentication
Click either Anonymous or Use Credentials.
If you are using credentials, click Add to configure the CIFS share credentials, or select an existing credential definition and click Edit.
 
4. Enter credentials in the Credentials window.
Field
Description
Sample Value
Name*
Assign a name to identify the set of credentials that you are defining.
 
Account*
Enter the login account name used to log in to the hosts. If the policy includes a group of Windows hosts, use the Windows Domain user ID. This user ID must have administrative privileges.
For Linux hosts, super-user root privileges are required. You could also use an access control command, such as sudo, sesudo, or pbrun. If using any of these access commands, ensure that the user ID has sudo, sesudo, or pbrun privileges. Some enterprises prefer to create a new user and provide access to commands via an access control command.
root
Description
Enter a note to help identify the credential.
 
Password
Enter the password for the account.
 
OS Type*
Select Windows, Linux, or NAS.
 
Windows Domain*
For Windows hosts only: Specify the Windows domain name. If the host is not a member of a domain, or to specify a local user account, use a period (.)
 
Private Key File
For Linux hosts only: If you have configured a Public/Private Key between your Data Collector server and the hosts you intend to monitor, specify the location of the Private Key file on the Data Collector Server.
 
Known Hosts File
For Linux hosts only: If you have configured a Public/Private Key between your Data Collector server and the hosts you intend to monitor, specify the location of the Known Hosts file on the Data Collector Server.
 
5. Click OK to close and save the configuration in each window.
Importing the CIFS Share Configuration
The import feature facilitates entry of a large number of CIFS shares. Simply paste the details in comma-separated format into the window and click OK.
Data Format:
server, share, protocol (CIFS), credential name
The Credential Names, already configured for the current Domain, are displayed at the top of the window.
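When many shares are involved, the import rows can be generated from an inventory list with a short script. This is a hypothetical helper (the server and credential names are made-up examples); it simply emits the server, share, protocol (CIFS), credential name format described above:

```python
def build_import_rows(shares):
    """Each share is (server, share_name, credential_name); protocol is CIFS."""
    return "\n".join(f"{srv}, {share}, CIFS, {cred}" for srv, share, cred in shares)

rows = build_import_rows([
    ("172.1.1.1", "HOME", "WinAdminCreds"),
    ("filer02", "DEPT", "WinAdminCreds"),
])
print(rows)
# 172.1.1.1, HOME, CIFS, WinAdminCreds
# filer02, DEPT, CIFS, WinAdminCreds
```

The credential name in each row must match a Credential Name already configured for the current Domain.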
 
NetApp Data Collector - File Analytics Activation
Use this option if you have NetApp storage systems for which you want to collect and profile files. This is the preferred method for CIFS collection on NetApp appliances.
Verify prerequisites (particularly, the NetApp User API role configuration) in the APTARE StorageConsole Data Collector Installation Guide for Capacity Manager.
See the APTARE StorageConsole User’s Guide.
Host Inventory - File Analytics Probe
Using Host Resources data collection, hosts are discovered and added to the Host Inventory. Once a host is listed in the inventory, it can be selected and the File Analytics probe can be configured. To access the Host Inventory to enable File Analytics probes: Admin > Data Collection > Host Inventory
Note that by design, File Analytics host resources data collection occurs via activation of the probe in the Host Inventory window in the Portal. Collection does not occur under the following circumstance:
The Validate option in the Portal’s Host Inventory window only runs a connectivity check. It does not collect File Analytics data.
File Analytics Probe Configurations by Operating System
Windows servers: A Data Collector must be running on a Windows 2008 server. A Domain Administrator ID is required when collecting file-level data for File Analytics.
Linux servers: Only Linux is supported (not HP-UX or AIX), with the following requirements:
Root user access is supported.
Non-root user access with sudo access control is supported.
Non-root user access without sudo is not supported.
Running collection with a sudo user on a Linux server requires the addition of an access control command for the server in the Host Inventory’s Manage Access Control window:
Admin > Data Collection > Host Inventory
Also, an advanced parameter must be created: FA_USE_SUDO set to Y.
To access Advanced Parameters in the Portal, select Admin > Advanced > Parameters.
Both Windows and Linux Servers
If running collection via the checkinstall utility, verify the following:
An advanced parameter must be created: FA_HOST_VALIDATE set to Y.
To access Advanced Parameters in the Portal, select Admin > Advanced > Parameters.
Best Practices for Host Inventory File Analytics Probes
File Analytics should be configured to run daily for all hosts/servers.
Since most environments have hundreds, even thousands of hosts, it is recommended that File Analytics probes be configured in a staggered schedule so as not to overload the Data Collector server.
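The staggering recommendation above can be sketched as follows. This is an illustrative helper, not a StorageConsole feature: it assigns each host a daily CRON schedule offset by a fixed interval (the host names and the 01:00 start time are arbitrary examples):

```python
def staggered_daily_cron(hosts, start_hour=1, interval_minutes=30):
    """Return {host: cron_expression}, offsetting each host's daily probe
    by interval_minutes so the probes do not all start at once."""
    schedules = {}
    for i, host in enumerate(hosts):
        total = start_hour * 60 + i * interval_minutes
        hour, minute = (total // 60) % 24, total % 60
        schedules[host] = f"{minute} {hour} * * *"
    return schedules

for host, cron in staggered_daily_cron(["hostA", "hostB", "hostC"]).items():
    print(host, cron)
# hostA 0 1 * * *
# hostB 30 1 * * *
# hostC 0 2 * * *
```

Each resulting expression can be pasted into the policy's Schedule field as a native CRON string, keeping the probes daily while spreading the load on the Data Collector server.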