
SAP HANA connectivity with BI Clients


SAP HANA provides SQL and MDX query language interfaces that allow third-party data access.
MDX (Multidimensional Expressions) is a query language developed by Microsoft for querying multidimensional data. An MDX expression returns a multidimensional result set (dataset) that consists of axis data and cell data.
  • SAP BusinessObjects BI has been optimized to run on SAP HANA, providing a full suite of business intelligence capabilities for any BI use case.
  • Third-party front-end applications (BI clients) must go through an official certification program to be certified and supported on top of SAP HANA.




Connectivity Options
BI clients can use the ODBC, JDBC, and ODBO (for MDX requests) drivers that ship with SAP HANA for reporting.
SAP HANA also supports the BICS interface.
Business Intelligence Consumer Services (BICS) is SAP's proprietary interface for BEx queries; BEx Analyzer, for example, connects to HANA via BICS.
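As a rough sketch of what these client-side connections look like in practice, the snippet below assembles a JDBC-style URL (SAP's JDBC driver uses the `jdbc:sap://` scheme) and an ODBC connection string (the HANA ODBC driver is registered under the name `HDBODBC`, with a `SERVERNODE` keyword). Host names, ports, and credentials here are placeholder values, not defaults.

```python
# Sketch: assembling SAP HANA connection strings for JDBC and ODBC clients.
# All host/port/credential values below are illustrative placeholders.

def hana_jdbc_url(host, port, database=None):
    """Build a JDBC URL of the form jdbc:sap://<host>:<port>[/?databaseName=<db>]."""
    url = f"jdbc:sap://{host}:{port}"
    if database:
        # Multitenant systems address a specific tenant database by name.
        url += f"/?databaseName={database}"
    return url

def hana_odbc_connstring(host, port, user, password):
    """Build an ODBC connection string using the HDBODBC driver name."""
    return (
        f"DRIVER={{HDBODBC}};SERVERNODE={host}:{port};"
        f"UID={user};PWD={password}"
    )

print(hana_jdbc_url("hana.example.com", 30015))
# jdbc:sap://hana.example.com:30015
print(hana_odbc_connstring("hana.example.com", 30015, "REPORT_USER", "secret"))
```

A Java BI client would pass the JDBC URL to `DriverManager.getConnection` with SAP's driver on the classpath; an ODBC-based client would hand the connection string to its ODBC layer. BICS, by contrast, is not exposed through these generic drivers and is consumed only by SAP's own front ends.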

