
Thursday, 9 April 2009

Upgrading to 11g, a technical workshop seminar by Mike Dietrich

On the 7th of April I attended the Oracle 11g upgrade technical workshop seminar given by Mike Dietrich at Oracle's London City offices.

This was one of the best Oracle events I have attended.

Mike ran us through his 400+ slides of tips & tricks and best practices for upgrading to Oracle 11g. He also presented three real-life cases in which he helped three big Oracle customers with their upgrades to 11g. The content of the slides is very good and Mike's presentation skills were excellent. In this post there is a link to the slides. They are quite large, so you have to download them in three parts (about 8 MB in total). Here are the slides: Oracle 11g Upgrade Workshop presentation


Workshop presentation highlights


  • Better support for the optimizer during the upgrade.
  • Definitely apply the time zone patch; you can't upgrade without it.
  • Recalculate statistics before the upgrade (see the sketch after this list).
  • If you stay on the same host, use the DBUA.
  • If you move to a different host, use the command-line upgrade.
  • Provision for performance degradation: test, test and test again.
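On the statistics point, here is a rough Python sketch of what "recalculate statistics before the upgrade" might look like. It assumes the cx_Oracle driver and uses placeholder credentials and a placeholder schema name; the DBMS_STATS calls are the standard ones, not anything specific from Mike's slides.

```python
# Rough sketch: refresh optimizer statistics before an 11g upgrade.
# Assumes cx_Oracle is installed; credentials and schema name are placeholders.
import cx_Oracle

conn = cx_Oracle.connect("system", "manager", "dbhost/ORCL")  # placeholder connection
cur = conn.cursor()

# Gather dictionary statistics so the upgrade scripts themselves run with fresh stats.
cur.execute("BEGIN DBMS_STATS.GATHER_DICTIONARY_STATS; END;")

# Gather statistics for an application schema (replace 'APPUSER' with your own).
cur.execute("BEGIN DBMS_STATS.GATHER_SCHEMA_STATS(ownname => 'APPUSER'); END;")

cur.close()
conn.close()
```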


It seems that when it comes to Oracle upgrades, people will always wait for R2. Mike said Oracle is well aware of this, and that with 11g they tried to break this taboo by explaining to us that Oracle is not actually introducing anything special in the way of bug fixes in R2. People will see this for themselves when they see that it is only little bugs that get fixed in R2.

From the presentation I also got the impression that upgrading to 11g is easy and takes little time. But the performance implications the upgrade might introduce are still hard to control if you haven't purchased the Diagnostics and Tuning pack licences (£1,777 each per CPU). The Diagnostics pack licence is designed to show, and automatically sort out, all the SQL statements that will regress! Without it, your DBA might end up dealing for days with hundreds of "regressed" SQL statements caused by the upgrade, whereas the packs deal with it in hours. Poor DBA chap! (Hopefully not!)
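To make the manual alternative concrete, one thing a DBA without the packs can do is snapshot per-statement timings from V$SQL before the upgrade, take the same snapshot afterwards, and compare the two by hand. This is just my own rough Python sketch, not anything from the workshop; it again assumes cx_Oracle, and the file name is arbitrary.

```python
# Rough sketch: capture per-statement timings from V$SQL so the same snapshot
# taken after the upgrade can be compared by hand (or in a spreadsheet).
# Assumes cx_Oracle; connection details and file name are placeholders.
import csv
import cx_Oracle

conn = cx_Oracle.connect("system", "manager", "dbhost/ORCL")  # placeholder connection
cur = conn.cursor()
cur.execute("""
    SELECT sql_id,
           executions,
           ROUND(elapsed_time / NULLIF(executions, 0)) AS avg_elapsed_us
      FROM v$sql
     WHERE executions > 0
""")

with open("sql_timings_before_upgrade.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["sql_id", "executions", "avg_elapsed_us"])
    writer.writerows(cur)  # each fetched row becomes one CSV line

cur.close()
conn.close()
```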

Thursday, 2 April 2009

"Informatica on Demand" an ETL tool running in the Cloud

Recently I was looking for ways to integrate (ETL) data from separate sources into the Cloud: moving data between MS SQL Server and Salesforce, or from MS SQL Server to Oracle, for example. Mostly, though, I was interested in an easy-peasy way to migrate data from back-end database systems such as MS SQL Server into the Salesforce Cloud.

How is that possible? The first things that come to mind are APIs, Java, import & export, and custom code, code and more code.

Not really; all you need is Informatica on Demand! A simple, entry-level, absolutely free ETL tool which runs in the Cloud!

Informatica is a very reputable company in the data integration field. Informatica on Demand is part of their free Software as a Service (SaaS) offering: an ETL tool which is itself in the Cloud. You don't have to install any software anywhere in your infrastructure, apart from a little Informatica agent which identifies your network to the Informatica Cloud. All you do is create a login on their website and start working! How cool is that! You log in to the Informatica on Demand website and configure, and even schedule, your data migration and ETL tasks between your databases and the Cloud as easily as checking your email!

This free tool does schema-to-schema migrations only. It doesn't let you migrate by writing custom SQL in the source window and loading its result set into the target; it is limited to table-to-table migrations between separate sources and targets. You can, however, save your SQL as a view in the source database and the tool will see it as a table object.
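For example, if the source is SQL Server, the workaround is simply to wrap the custom SQL in a view and let the tool pick that up as its source object. A minimal sketch using Python and pyodbc; the ODBC driver name, connection string, view name and query are all placeholders of mine:

```python
# Minimal sketch: expose a custom query to the tool by saving it as a view
# in the source database, since the tool only sees table-like objects.
# Assumes pyodbc and a SQL Server source; all names below are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sqlhost;DATABASE=crm_source;UID=etl_user;PWD=secret"
)
cur = conn.cursor()

# The custom SQL you would have liked to type into the tool goes inside the view.
cur.execute("""
    CREATE VIEW dbo.v_active_accounts AS
    SELECT a.account_id, a.name, c.email
      FROM dbo.accounts a
      JOIN dbo.contacts c ON c.account_id = a.account_id
     WHERE a.status = 'ACTIVE'
""")
conn.commit()
cur.close()
conn.close()
```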

On the same Informatica on Demand website, and within the same Cloud infrastructure, they also have a suite of more advanced tools such as Data Synchronization, Data Replication and Data Assessment. This stack of tools is not free, but a 30-day trial is still possible.

Overall, I found the tool quite easy to use: with a few clicks I could transfer thousands of rows from, say, Oracle to Salesforce, or from Microsoft SQL Server to Oracle. There is no need to install and configure gateways or ODBC, and no need to tweak any hardware locally. It also has a scheduler which works and sends emails when tasks complete, and all of this is hosted and running in the Cloud, so you don't have to back anything up or look after things.

Here is a snapshot of the Field Mapping screen: