Now it's a case of either adding this subject area to an existing DAC execution plan, or creating a new execution plan just for this data copy subject area. In fact, given the typical size of data warehouses, including the BI Apps data warehouse, it's unlikely that you'll ever get your whole dataset into TimesTen as it has to store everything in-memory, so you've then got to choose either to:
Finally, for this stage, press the Build button above the list of execution plans with this one still selected, to create the list of ordered tasks for this new execution plan.
It does, however, come with some significant limitations, some of which you can work around but others you'd need to just live with.
Now you need to create a logical data source that we'll use later to point to this physical connection. These new tasks now need to be added to an existing, or a new, subject area so that you can then include them in an execution plan.
- Just store aggregates in TimesTen, which is the route Oracle went down with Exalytics
- Only try this with data marts or other small data warehouses
- Put just the "hot" data from your data warehouse into TimesTen - for example, the last couple of years' data across the data warehouse, or a particular fact table and its dimensions

It's this third approach that the new DAC 11g is most suited to, with functionality to mark one or more BI Apps DW tables as "in-memory", then replicate them into TimesTen using an ETL engine actually built into the DAC, so there's no need to add new Informatica routines or otherwise customise your main load routine.
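Picking the right "hot" slice comes down to simple arithmetic: candidate row count, times average row width, plus index and temp-space overhead, has to fit in the RAM you can give TimesTen. A rough back-of-envelope check (all figures and the overhead factor are illustrative, not from the post):

```python
def fits_in_timesten(row_count, avg_row_bytes, ram_gb, overhead=1.5):
    """Rough sizing check for a candidate 'hot' data slice.

    overhead is a fudge factor for indexes, temp space and TimesTen's
    own per-row storage cost - tune it for your own schema.
    """
    needed_bytes = row_count * avg_row_bytes * overhead
    return needed_bytes <= ram_gb * 2**30

# e.g. two years of a 100m-row fact table at ~200 bytes/row, 64 GB of RAM
print(fits_in_timesten(100_000_000, 200, 64))
```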
At the end of these steps you should have an additional physical datasource registered in the DAC console, one that connects through to your TimesTen database, like this:

So how does this work, and is it a viable solution for moving your BI Apps data warehouse in-memory?
Press the Add All button to add these tasks to the right-hand list of selected tasks, then press OK.

- As an in-memory aggregate cache, automatically populated and managed by OBIEE's Summary Advisor, as part of the Exalytics hardware and software package
- As a regular database but stored in-memory, used to hold detail-level data warehouse data plus any aggregate tables you choose to add in

Strictly-speaking, the second use-case can be done outside of Exalytics, but you'll most probably need to use the TimesTen table compression feature only available within Exalytics if you want to store any significant volume of data in TimesTen.
Now switch to the Connectivity Parameters tab and press the Generate button. But there's another way you can replicate data into TimesTen that doesn't involve the Summary Advisor, scripts or indeed anything outside of a regular BI Apps installation - it's actually the DAC, which in the 11g release has a feature for replicating tables into the TimesTen in-memory database.
Once the parameter generation process completes, select the following values for the two parameters: DAC 11g executes the data copy using multiple Java JDBC calls that take subsets of the source table's data and copy them into the target table, and you'll most probably need to fiddle around with these settings for your own environment, to strike the right balance for concurrent data loading. Now back to the data copy tasks.
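The chunked-copy approach DAC uses can be sketched in a few lines. This is not DAC's actual Java code - just a minimal Python illustration of the same idea, using sqlite3 as a stand-in for the JDBC source and target connections, with each range-partitioned chunk committed separately (DAC runs its chunks over parallel connections; here they run in a simple loop):

```python
import sqlite3

def copy_in_chunks(conn, src, dst, chunk_size=1000):
    """Copy src into dst in range-partitioned batches.

    DAC 11g does the equivalent with multiple parallel JDBC calls,
    each taking one subset of the source table's rows.
    """
    lo, hi = conn.execute(f"SELECT MIN(rowid), MAX(rowid) FROM {src}").fetchone()
    if lo is None:          # empty source table
        return 0
    copied = 0
    for start in range(lo, hi + 1, chunk_size):
        cur = conn.execute(
            f"INSERT INTO {dst} SELECT * FROM {src} WHERE rowid BETWEEN ? AND ?",
            (start, start + chunk_size - 1),
        )
        copied += cur.rowcount
        conn.commit()       # one transaction per chunk
    return copied
```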
To do so, I perform the following steps: As I talked about in my blog post on indexing TimesTen tables a few weeks ago, you should combine this replication process with the use of the TimesTen Index Advisor, which generates further index recommendations based on the actual query workload on your system, and can dramatically improve the performance of queries against TimesTen tables compared to what you'd get with the default DAC-generated indexes.
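For reference, the Index Advisor capture is driven through a handful of TimesTen built-in procedures. The helper below just assembles that sequence of statements, to be executed over ttIsql or any ODBC/JDBC connection; the procedure names and arguments are as I understand them from the TimesTen 11.2.2 documentation, so verify them against your release before relying on this:

```python
def index_advice_statements(workload_sql):
    """Assemble a connection-level Index Advisor capture session.

    Procedure names/arguments are from my reading of the TimesTen docs
    (0 = capture for the current connection; 31 is a capture-mode
    bitmask) - confirm both for your version before use.
    """
    stmts = ["call ttIndexAdviceCaptureStart(0, 31)"]      # begin capture
    stmts += list(workload_sql)                            # replay the real query workload
    stmts += [
        "call ttIndexAdviceCaptureEnd(0)",                 # stop capturing
        "call ttIndexAdviceCaptureOutput(0)",              # emit CREATE INDEX advice
        "call ttIndexAdviceCaptureDrop(0)",                # discard captured data
    ]
    return stmts
```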
Now we're ready to run the data copy process, and replicate these two DW tables into the TimesTen database. To do so, again within the Execute view and with the new execution plan selected, press Run.
When the Generate Data Copy Tasks dialog is shown, select the following values: So let's start with these two, and see where things go. In this instance, I'm going to create a new subject area for the tasks. But … what happens next, and how do you get these tables into the BI Apps repository and make them available to the BI Apps dashboards, so that they can be used in place of the regular Oracle tables to speed up queries?
Using DAC11g to Replicate BI Apps DW Tables into TimesTen
To create the data copy tasks, I do the following: Well - you'll have to wait for the follow-up…
To select tables for replicating into TimesTen, switch to the Design view, then use the Tables tab to locate the tables, and check the In Memory checkbox, like this: Another type of external executor that you can register though is called "Data Copy", and this is a special additional data movement feature that's actually built into DAC 11g and uses parallel Java processes to copy data from one table to another.
Note that the Task Phase can be changed to suit where you want the data copying to happen in a real-life project, and that you can check the Enable Incremental Data Copy checkbox if you want to update just the new and changed data in your TimesTen tables - see the online docs on this feature for more details.
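Conceptually, incremental data copy just restricts the copy to rows stamped since the last refresh. A minimal sketch of that idea, again in Python with sqlite3 standing in for the real connections; ROW_WID and W_UPDATE_DT are the usual BI Apps surrogate-key and audit columns, but how DAC actually detects changed rows may differ, so treat this purely as an illustration:

```python
import sqlite3

def incremental_copy(conn, src, dst, last_refresh):
    """Re-copy only rows changed since last_refresh.

    Sketch of the 'Enable Incremental Data Copy' idea: delete the
    target's stale versions of changed rows, then re-insert them
    from the source.
    """
    conn.execute(
        f"DELETE FROM {dst} WHERE row_wid IN "
        f"(SELECT row_wid FROM {src} WHERE w_update_dt > ?)",
        (last_refresh,),
    )
    cur = conn.execute(
        f"INSERT INTO {dst} SELECT * FROM {src} WHERE w_update_dt > ?",
        (last_refresh,),
    )
    conn.commit()
    return cur.rowcount
```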
Let's go through what's involved though, using BI Apps 7. To do so, I perform the following steps: This connects the tasks' logical data source connections to the physical ones, including the physical TimesTen database connection I created earlier in the process.
Once the execution plan has completed, you should see it listed under the Run History tab, hopefully with a success message next to it.