How to validate ODI Repository and OBIEE web catalog

1.     ODI Repository Validation

 

1.1  Introduction

 

The Repository Consistency Checker (RCC) is an Oracle Data Integrator (ODI) utility developed by the ODI Support team to:

  • Verify the structure and data integrity of your ODI Repositories in order to guarantee their successful upgrade to ODI 10.1.3.6 or above.
  • Detect Repository data issues, such as:
    • Incoherent or inconsistent Internal Identifiers.
    • Damaged, duplicate and orphan Texts.
    • Corrupted and orphan Log messages.
    • Integration Interfaces that are not consistent with the Topology.
    • Incorrect interactions between design objects.

It is highly recommended to run the RCC utility:

  • Prior to upgrading Repositories to a version compatible with ODI 10.1.3.6 or above.
    Note: if your upgrade requires intermediate steps (for example, an upgrade from Sunopsis v3 to ODI 11g requires an intermediate upgrade from Sunopsis v3 to ODI 10g, then from ODI 10g to ODI 11g), run the RCC prior to the first upgrade operation (in this example, run the RCC on the Sunopsis v3 Repositories).
  • To diagnose Repository data corruptions and inconsistencies.
  • Regularly for Work Repositories hosted in a Microsoft SQL Server database with multiple concurrent developers.

The Repository Consistency Checker (RCC) utility runs on Windows Operating System only.

1.2  Pre-requisites

  1. The user connected to the operating system must have read and write privileges on the “diag_tools” directory (the RCC installation folder).
  2. Ensure that there is sufficient disk space for the RCC to create the log and diagnostic data files in the “diag_tools/logs” directory.
  3. While running the analysis, the RCC creates several temporary tables in the database/schema that hosts the analysed Repository:
     • Ensure that the database storage for the Repository (data files, log files, tablespaces… depending on the database/schema that hosts the Repository) can grow to store the temporary tables.
     • Oracle recommends doubling the size of this storage prior to launching the RCC utility. Contact your DBA to check or expand the available storage (see the query sketch after this list).
  4. When running the analysis, the RCC may need to create several views in the database/schema that hosts the analysed Repository:
     • Ensure that the user connected to the Repository database/schema is granted the “create view” privilege.
  5. Depending on the technology hosting the Repositories to be analysed, and prior to running the RCC utility, ensure that the appropriate JDBC drivers are present in the “/drivers” directory of the RCC installation.
  6. Ideally, install the RCC on a machine from which you already connect to the ODI (Sunopsis) environment to be analysed. Otherwise, make sure that your network and firewall policies allow communication between this machine and the server hosting the ODI (Sunopsis) Repositories to be analysed.
  7. The RCC is a Java-based application and requires a JVM 1.4, 1.5 or 1.6 to be installed on the same machine.
  8. Ensure that all Sunopsis/ODI processes (Sunopsis/ODI graphical modules or Agents) have been stopped and are not accessing or modifying the Repository content while the RCC is analysing the Repository:
     • Stop all your Sunopsis/ODI Agents, including the ones launched as services.
     • Close any Sunopsis/ODI GUI such as Designer, Operator, Topology Manager or Security Manager.
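
For an Oracle-hosted Repository, the storage and privilege prerequisites above can be checked quickly with standard SQL. The following is only a sketch using standard Oracle data dictionary views (dba_free_space requires DBA privileges); adapt it for other database technologies:

-- Check free space per tablespace before running the RCC (Oracle example)
select tablespace_name, round(sum(bytes)/1024/1024) as free_mb
  from dba_free_space
 group by tablespace_name;

-- Grant the "create view" privilege to the Repository schema owner (run as a DBA)
grant create view to <repository_schema_owner>;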

1.3  Installation

  1. Download version 12.1.2.0.06 of the RCC (the diag_tools_v1212006_part1.zip and diag_tools_part2.zip files).
  2. Create a “/diag_tools” directory.
  3. Unzip the files into the “/diag_tools” directory above.

    The “/diag_tools” directory must contain:

    • a “readme.txt” file
    • a “/bin” folder, containing the RCC executable files
    • a “/config” folder, containing a Master Repository connection configuration template (sample.properties)
    • a “/drivers” folder, containing the JDBC drivers
    • a “/lib” folder, containing RCC infrastructure data
    • a “/logs” folder

 

  4. Edit the “rccparams.bat” configuration file located in the “/bin” directory of the RCC installation, and make sure that the RCC_JAVA_HOME variable points to an existing Java 1.4 (or above) environment (see the illustrative line after this list).
    Ex: C:\KrisMon\Upgrade\RCC Software\diag_tools\bin\rccparams.bat
  5. Depending on the technology hosting the Repositories to be analysed, copy the appropriate JDBC drivers into the “/drivers” directory of the RCC installation.
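
As an illustration, the variable is typically set with standard Windows batch syntax; the Java path below is only an assumption and must be replaced by your local installation path:

REM Point the RCC at a local JVM; the path below is only an example
set RCC_JAVA_HOME=C:\Program Files\Java\jre1.6.0_45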

1.4  Procedure

  1. Use the existing “sample.properties” file as a template to create your own Master Repository connection file, and copy it to the “diag_tools/config” directory.
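
The exact property names are defined by the sample.properties template shipped with the RCC. Purely as a hypothetical illustration (the key names below are placeholders, not the real RCC keys), such a connection file captures the Master Repository JDBC driver, URL and credentials:

# Hypothetical key names for illustration only; copy the real keys from sample.properties
master.driver=oracle.jdbc.OracleDriver
master.url=jdbc:oracle:thin:@dbhost:1521:ORCL
master.user=odi_master
master.password=<master_password>
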
  2. Launch the Repository Consistency Checker (RCC) environment:

diag_tools/bin/start_diag.bat

  3. Run the Repository Consistency Checker (RCC) tool:

– Open a command line prompt and go to the RCC “bin” directory.

– Run the RCC script:

diag_tools/bin/rcc.bat

– Follow the step-by-step instructions, and monitor the execution from the command line window.


  4. Stop the Repository Consistency Checker (RCC) environment:

diag_tools/bin/stop_diag.bat

1.5  Log files

The Repository Consistency Checker (RCC) tool creates:

 

  1.    Three XML files:

M_04.02.02.01_SANITY_TOOL.xml,

W_04.02.02.01_SANITY_TOOL.xml and

E_04.02.02.01_SANITY_TOOL.xml

  2. A “RCC_consoleYYYYMMDD.out” file, which contains the exact copy of the information displayed by the RCC during its execution.
  3. A diagnostic report, “ODI_consistency_checkYYYYMMDD.log”, located in the “/diag_tools/logs” directory, and containing the detailed results of the RCC run.
  4. One or more data files, in the “/diag_tools/logs/DiagsYYYYMMDD” sub-folder, to be transmitted to the ODI Support team.
  5. Diagnosis tables in the database hosting the analysed ODI Repository. These tables are prefixed by ORA_DIAG or E_, and should not be removed from the Repository database without approval from ODI Support (a query sketch to list them follows).
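
Assuming the Repository is hosted in an Oracle schema, a simple dictionary query (a sketch only) lists the diagnosis tables left behind by the RCC:

-- List RCC diagnosis tables in the current Repository schema (Oracle example)
select table_name
  from user_tables
 where table_name like 'ORA_DIAG%'
    or table_name like 'E\_%' escape '\';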


2.    OBIEE Web Catalog Validation

 

2.1  Update GUIDs

 

  • In Oracle BI 11g, users are recognized by their Global Unique Identifiers (GUIDs), not by their names.
  • GUIDs are identifiers that are completely unique for a given user.
  • Using GUIDs to identify users provides a higher level of security because it ensures that data and metadata are uniquely secured for a specific user, independent of the user name.

To refresh the GUIDs, follow the steps below.

  • Stop OPMN services
  • Update the instanceconfig.xml file with the tag <UpdateAccountGUIDs>UpdateAndExit</UpdateAccountGUIDs> (see the placement illustration after the path below)

cd /app/oracle/product/obidev/instances/instance1/config/OracleBIPresentationServicesComponent/coreapplication_obips1
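
For illustration, the tag is placed inside the Catalog element under ServerInstance in this instanceconfig.xml (the same placement used by the validation snippets later in this document); verify it against your own file:

<ServerInstance>
<Catalog>
<UpdateAccountGUIDs>UpdateAndExit</UpdateAccountGUIDs>
</Catalog>
</ServerInstance>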

  • Set the parameter FMW_UPDATE_ROLE_AND_USER_REF_GUIDS to ‘YES’ in the NQSConfig.INI file (see the illustrative line after the path below)

cd /app/oracle/product/obidev/instances/instance1/config/OracleBIServerComponent/coreapplication_obis1
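
As an illustration, the resulting entry in NQSConfig.INI looks like the following (entries in this file are terminated by a semicolon):

FMW_UPDATE_ROLE_AND_USER_REF_GUIDS = YES;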

  • Restart OPMN services

 

  • Once the GUIDs have been refreshed, stop OPMN services, revert all the changes in the instanceconfig.xml and NQSConfig.INI files, and start OPMN services again.

Log Message:

/app/oracle/product/obidev/instances/instance1/diagnostics/logs/OracleBIPresentationServicesComponent/coreapplication_obips1/webcatupgrade0.log

[2015-01-20T14:06:47.000+02:00] [OBIPS] [NOTIFICATION:1] [] [saw.subsystem.catalog.initialize.upgrade] [ecid: ] [tid: ] Starting to update Account GUIDs[[

File:initializecatalog.cpp

Line:1040

Location:

saw.subsystem.catalog.initialize.upgrade

saw.subsystem.catalog.initialize

saw.subsystems.catalogbootstrapper.loadcatalog

saw.webextensionbase.init

saw.sawserver

ecid:

]]

[2015-01-20T14:06:50.000+02:00] [OBIPS] [TRACE:1] [] [saw.subsystem.catalog.initialize.upgrade] [ecid: ] [tid: ] Update Account GUIDs Statistics: It took 2seconds to update 0catalog items[[

File:initializecatalog.cpp

Line:1114

Location:

saw.subsystem.catalog.initialize.upgrade

saw.subsystem.catalog.initialize

saw.subsystems.catalogbootstrapper.loadcatalog

saw.webextensionbase.init

saw.sawserver

ecid:

]]

[2015-01-20T14:06:50.000+02:00] [OBIPS] [NOTIFICATION:1] [] [saw.subsystem.catalog.initialize.upgrade] [ecid: ] [tid: ] Succeeded in updating account GUIDs from back end user population store[[

File:initializecatalog.cpp

Line:1125

Location:

saw.subsystem.catalog.initialize.upgrade

saw.subsystem.catalog.initialize

saw.subsystems.catalogbootstrapper.loadcatalog

saw.webextensionbase.init

saw.sawserver

ecid:

]]

 

2.2  Web Catalog Validation

 

It is a good idea to run catalog validation regularly to avoid accumulating inconsistent objects. This way you avoid having hundreds of users simultaneously facing such problems.
Over time, inconsistencies can develop in the web catalog as links are broken, users are deleted, or NFS file system issues are encountered. These inconsistencies can eventually lead to incorrect behaviour, such as the inability to edit an agent’s recipient list, login issues, and so on. You can periodically take the production system offline and validate the catalog, so that you are informed of inconsistencies and can take corrective action.

The validation process performs the following tasks:

  • Ensures that each object in the catalog is larger than zero bytes.
  • Ensures that each item in the catalog has a valid corresponding .atr file.
  • Ensures that each link in the catalog is valid.
  • Ensures that the files in the account cache are valid.
  • Ensures that all XML objects in the catalog pass schema validation.

  • Attempts to repair object names that were damaged by FTP programs.

  • Stop Presentation Services
    opmnctl stopproc ias-component=coreapplication_obips1
  • Create backups
    Back up the web catalog using a zip utility.
    Take a backup copy of the instanceconfig.xml file.
  • Add the following tags to the instanceconfig.xml file:

<Catalog>
<Validate>OnStartupAndExit</Validate>
<ValidateAccounts>Report</ValidateAccounts>
<ValidateHomes>Report</ValidateHomes>
<ValidateItems>Report</ValidateItems>
<ValidateLinks>Report</ValidateLinks>
</Catalog>

 

  • Start Presentation Services to run the validation according to the values that you specified in the instanceconfig.xml file.
    opmnctl startproc ias-component=coreapplication_obips1
  • Check the sawlog file to confirm that the validation completed successfully; you should see lines similar to the following:

[OBIPS] [NOTIFICATION:1] [] [saw.security.validate] [ecid: ] [tid: ] Completed catalog validation.  Please review log.[[

File:securitymanagerloaderimpl.cpp

Line:491

Location:

saw.security.validate

saw.security.securityimpl.initialize

saw.catalog.local.loadCatalog

saw.subsystems.catalogbootstrapper.loadcatalog

saw.webextensionbase.init

saw.sawserver

ecid:

]]

  • Remove the tags mentioned above, keeping the default value:

<Catalog>
<UpgradeAndExit>false</UpgradeAndExit>
</Catalog>

  • Start Presentation Services
  • Stop Presentation Services again
  • Add the following tags to the instanceconfig.xml file:

<ServerInstance>
<Catalog>
<Validate>OnStartupAndExit</Validate>
<ValidateAccounts>Clean</ValidateAccounts>
<ValidateHomes>Report</ValidateHomes>
<ValidateItems>Clean</ValidateItems>
<ValidateLinks>Clean</ValidateLinks>
</Catalog>
</ServerInstance>

  • Check the sawlog file for lines similar to the following:

[OBIPS] [NOTIFICATION:1] [] [saw.security.validate] [ecid: ] [tid: ] Completed catalog validation.  Please review log.[[

[OBIPS] [TRACE:1] [] [saw.catalog.local.readObject] [ecid: ] [tid: ] Succeeded with ‘/users/weblogic/RMA Report Agent'[[

[OBIPS] [TRACE:32] [] [saw.catalog.impl.search] [ecid: ] [tid: ] Function search took 474 milliseconds.[[

[OBIPS] [NOTIFICATION:1] [] [saw.catalog.scrubacls.scrub] [ecid: ] [tid: ] Scrubbed 106 of 12987 records[[

[OBIPS] [TRACE:1] [] [saw.catalog.scrubacls.timings] [ecid: ] [tid: ] Catalog scrub took 22 seconds[[

[OBIPS] [TRACE:16] [] [saw.odbc.connection.close] [ecid: ] [tid: ] SQLDisconnect returned code SQL_SUCCESS[[

[OBIPS] [TRACE:16] [] [saw.odbc.connection.destruct] [ecid: ] [tid: ] SQLFreeConnect returned code SQL_SUCCESS[[

[OBIPS] [NOTIFICATION:1] [] [saw.sawserver] [ecid: ] [tid: ] Oracle BI Presentation Services are shutting down.

 


OBIA Key Columns and Terminology

1. Prune Days: If the prune days parameter is set to 1, for example, the change capture process uses LAST_REFRESH_DATE – Prune Days as the cut-off timestamp for change capture (so changes are picked up from one day before the last refresh).
2. INTEGRATION_ID: Stores the primary key or the unique identifier of a record as in the source table.
3. DATASOURCE_NUM_ID: Stores the data source from which the data is extracted. It is configured either in BIACM or in DAC.
4. ROW_WID: A sequence number generated during the ETL process, used as the unique identifier for the Oracle Business Analytics Warehouse.
5. ETL_PROC_WID: Stores the ID of the ETL process information, like a batch number in any EDW.

A little-known fact is that this value is derived from the start time of each Execution Plan run: the number represents the number of minutes between 01/01/1970 and the start of the Execution Plan.

So, for example:

ETL_PROC_WID 23007865 refers to a run of an Execution Plan started on 29/09/2013 at 16:25.

As shown by the following code:

select to_date('19700101','YYYYMMDD') + (23007865)/(24*60) from dual;
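
Conversely, the same relationship can be used to derive the ETL_PROC_WID for a given Execution Plan start time; a sketch based on the minutes-since-01/01/1970 rule described above:

select round((to_date('29/09/2013 16:25', 'DD/MM/YYYY HH24:MI')
            - to_date('19700101', 'YYYYMMDD')) * 24 * 60) as etl_proc_wid
  from dual;
-- returns 23007865 for the example above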

6. Primary Extract and Delete Mappings: Primary extract and delete mappings allow your analytics system to determine which records are removed from the source system by comparing primary extract staging tables with the most current Oracle Business Analytics Warehouse table.

The primary extract mappings perform a full extract of the primary keys from the source system. Only the Key ID and Source ID information is extracted from the source table. The primary extract mappings load these two columns into staging tables that are marked with a *_PE suffix.

7. Document Amount and Local Amount: OBIA facts store amounts in various currencies:

Document Currency: This is the currency of the actual transaction. For a multinational company, this can be in various currencies.

Local Currency: This is the base currency in which the accounting entries are recorded by the business. This is generally defined in the Ledger of the company.

8. DB_CR_IND: Flag indicating whether it is a Debit or Credit type transaction.
9. Mapping standards:

PLP: Post Load Process ETLs, which perform tasks after the data load mappings have run (for example, creating indexes).

SDE: Source Dependent Extract mappings, which extract data from the supported source systems (Oracle EBS, PeopleSoft, etc.).

SIL: Source Independent Load mappings, which load data from the OBAW staging layer to the OBAW target tables.

UA: Universal Adapter mappings; these are flat-file-based data load ETLs provided for non-Oracle EBS, non-PeopleSoft and non-Siebel source systems.

ORA_XXXX: Oracle EBS version supported by the SDE mappings.

PSFT_XXX: PeopleSoft version supported by the SDE mappings.

SBL_XXX: Siebel version supported by the SDE mappings.

10. Universal Adapter: The Universal Adapter is provided to allow data from sources other than PeopleSoft, Oracle EBS, or Oracle JD Edwards to be brought into the Multiple Calendar tables. The data from these sources must be brought in using the following CSV files:

  • file_mcal_config_g.csv – loads W_MCAL_CONFIG_G.
  • file_mcal_context_g.csv – loads W_MCAL_CONTEXT_G.
  • file_mcal_cal_d.csv – loads W_MCAL_CAL_D.
  • file_mcal_period_ds.csv – loads W_MCAL_PERIOD_DS (staging table).

11. Files in OBIA 7.9.6.X:
  • file_group_acct_names.csv – this file specifies the group account names and their corresponding group account codes.
  • file_group_acct_codes_ora.csv – this file maps General Ledger accounts to group account codes.
  • file_grpact_fstmt.csv – this file maps Financial Statement Item Codes to group account codes.
12. DELETE_FLG: Indicates whether the transaction has been deleted in the source. By default it is ‘N’; when a transaction is deleted, it is marked as ‘Y’ through the “Primary Extract and Soft Delete” process.
13. Initial Extract Date: The Initial Extract Date is required when you extract data for a full load. It reduces the volume of data in the initial load.

The specified initial extract date will be used as a filter on the creation date of the transaction data in the selected full extract mapping.

OBIEE – Sum Distinct at Grand Total

Objective

The objective of the report is to show the Order Amount repeated for each Order Number, while the grand totals and sub-totals show the sum of the distinct measure values. The issue is that OBIEE Analytics does not provide Sum Distinct as an aggregation rule.

Implementation

One solution is to use the Sum Distinct function on the measure column, by removing the aggregation rule set on that measure column in the BMM layer and setting the aggregation rule to Server Complex Aggregate in Analytics. However, this goes against RPD design rules.

The other way is to use SUM(AVG(<Measure Name> BY <Key columns>) BY <Key columns>).

Ex: SUM(AVG(IFNULL("Fact – Order"."Order Amount", 0) BY "Order"."Order Number") BY "Order"."Order Number")

The inner AVG collapses the repeated Order Amount to a single value per Order Number, and the outer SUM then adds one value per order, so sub-totals and grand totals behave like a sum of distinct values. Max, Min, First or Last aggregation rules cannot be used here instead of AVG, because they would show wrong values if the amounts differ for the same order number.

ODI – Smart Export failing with ODI-10207

While trying to export custom folders through the Smart Export functionality in ODI Designer, the export fails with the following error: ODI-10207: Uncategorized exception during transaction system usage.

Increase the Java max heap size in the ide.conf file, which is found in
\oracledi\client\ide\bin (on the local machine).
You should see something like: AddVMOption -Xmx640M
Increase the value to: AddVMOption -Xmx1280M
or greater, and retest the Smart Export operation.
Also append the following to the second and third lines of your odi.conf file (\oracledi\client\odi\bin on the local machine):
AddVMOption -Xms128m
AddVMOption -Xmx10240m

OBIEE 11g: Presentation Services not starting

OBIEE Presentation Services do not start after the deployment, even when they are started individually from Unix.

Solution:

Check whether a sawserver process is still running under the orabi user (ps -ef | grep orabi).
It will show something like:
orabi 42271014 1 0 Oct 13 – 0:00 /oracle/product/ofm/Oracle_BI1/bifoundation/web/bin/sawserver
Kill that PID (kill -9 <PID No>).
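
Once the stale sawserver process has been killed, Presentation Services can be started again, for example with the same opmnctl command used in the web catalog validation steps above:

opmnctl startproc ias-component=coreapplication_obips1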

Thanks,

Krishna