In my current project I have reports running directly against the OLTP system. The flow looks roughly like this: OLTP --> metadata tables (replicas of the OLTP tables) --> joins across the metadata tables per report requirement --> reports.
The advantage of this approach is that the customer gets real-time data and doesn't have to create and maintain an ETL process and a DWH. The data set is also small.
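For concreteness, here is a minimal sketch of that pattern in Python, using the built-in sqlite3 module as a stand-in for the real databases (all table and view names are hypothetical): the report reads from a view that joins the replicated tables, so users see near-real-time data with no ETL or warehouse in between.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- stand-ins for tables replicated from the OLTP system
    CREATE TABLE customers (cust_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (order_id INTEGER PRIMARY KEY, cust_id INTEGER,
                         amount REAL, order_date TEXT);
    INSERT INTO customers VALUES (1, 'Acme');
    INSERT INTO orders VALUES (100, 1, 250.0, '2024-06-01');

    -- reporting layer: one denormalized view per report requirement
    CREATE VIEW v_customer_orders AS
    SELECT c.name, o.order_id, o.amount, o.order_date
    FROM orders o
    JOIN customers c ON c.cust_id = o.cust_id;
""")

# the report itself is just a query against the view
for row in conn.execute("SELECT * FROM v_customer_orders"):
    print(row)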
Traditionally I have seen an ETL process loading a data warehouse, with reports running against that reporting database. The biggest challenges in that approach are that reports run against historical data, and then there is the matter of performance.
Please suggest any other approaches we could take with the IBI tooling in place.
Best Regards, Prakash
If you have ad hoc users who'll be creating content in InfoAssist, the OLTP tables may be too complex for them. The advantage of a Datamart, or of ETLing the data into a dimensional model, is that it makes reporting much easier. Additionally, if you have slowly changing dimensions, a Datamart will track those changes with effective dates. If you design your dimensional models to track transactions and update them daily, the lag time you mention would be marginal, since most business intelligence reporting is geared toward historical trends anyway. You can also allow your reports to drill across from the Datamart to the OLTP system to get the latest information; a hybrid approach, where a dimensional model carries the bulk of the reporting with drill-down to the OLTP for current detail, might be a good alternative.
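For reference, a minimal sketch of the effective-date (Type 2 slowly changing dimension) pattern mentioned above, not WebFOCUS-specific and with made-up table and column names: when a tracked attribute changes, the current dimension row is closed out and a new row is opened with a fresh effective date, so history is preserved for reporting.

import sqlite3
from datetime import date

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE dim_customer (
    cust_id INTEGER, region TEXT,
    eff_date TEXT, end_date TEXT)""")

def apply_scd2(cust_id, new_region, today=None):
    today = today or date.today().isoformat()
    cur = conn.execute(
        "SELECT region FROM dim_customer "
        "WHERE cust_id = ? AND end_date IS NULL", (cust_id,))
    row = cur.fetchone()
    if row and row[0] == new_region:
        return  # no change; keep the current row open
    if row:
        # close the current version as of today
        conn.execute(
            "UPDATE dim_customer SET end_date = ? "
            "WHERE cust_id = ? AND end_date IS NULL", (today, cust_id))
    # open a new version carrying the changed attribute
    conn.execute(
        "INSERT INTO dim_customer VALUES (?, ?, ?, NULL)",
        (cust_id, new_region, today))

apply_scd2(101, "East", "2024-01-01")
apply_scd2(101, "West", "2024-06-01")   # region change -> two dated rows
print(conn.execute("SELECT * FROM dim_customer").fetchall())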
WebFOCUS 8206, Unix, Windows
Best practice is generally not to report against your transactional environment. Transactional table structures are optimized for writing, whereas a DWH is designed for reading; reports will generally run faster out of a DWH because it can be configured and designed in a way that's more efficient for queries.
Also, because a transactional system needs to write information quickly, you don't want a long-running report query getting in the way of those writes.
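To keep report queries off the transactional tables entirely, the usual pattern is a short, scheduled incremental extract that copies only rows changed since the last run into the warehouse. A minimal sketch, with all table and column names assumed for illustration:

import sqlite3

oltp = sqlite3.connect(":memory:")   # stand-in for the transactional DB
dwh = sqlite3.connect(":memory:")    # stand-in for the warehouse

oltp.execute("CREATE TABLE orders (order_id INTEGER, amount REAL, updated_at TEXT)")
oltp.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, 10.0, "2024-06-01"), (2, 25.0, "2024-06-02")])
dwh.execute("CREATE TABLE fact_orders (order_id INTEGER, amount REAL, updated_at TEXT)")

def incremental_load(last_watermark):
    # the only query that touches OLTP: short-lived, read-only, changed rows only
    changed = oltp.execute(
        "SELECT order_id, amount, updated_at FROM orders WHERE updated_at > ?",
        (last_watermark,)).fetchall()
    dwh.executemany("INSERT INTO fact_orders VALUES (?, ?, ?)", changed)
    # new watermark = newest timestamp loaded
    return max((r[2] for r in changed), default=last_watermark)

wm = incremental_load("2024-06-01")  # picks up only order 2
print(dwh.execute("SELECT * FROM fact_orders").fetchall(), wm)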
You don't have to follow this advice; you can always report directly off of your transactional system, and nothing stops you from doing so. But I would recommend against it most of the time. If you absolutely need real-time data for certain things, I would have IT write those reports rather than exposing them to end users. That way you can guarantee they are efficient enough not to be a problem for your system.
Good luck.
Eric Woerle 8.1.05M Gen 913- Reporting Server Unix 8.1.05 Client Unix Oracle 11.2.0.2