Friday, 28 June 2013

DataDirect: Access your favorite SaaS app with SQL

The database driver company Progress|DataDirect is writing the next generation of data access. They are working on a platform called DataDirect which will enable you to access loads of data sources in the cloud, from the cloud, and some of them with SQL.

Yes, you read that correctly: SQL!

All you have to do is visit their website, sign up for a trial account, and access your favorite SaaS app, e.g. Salesforce or any other supported cloud app, with SQL.

Here is what their connector availability looks like:

[Screenshot: DataDirect connector availability matrix]
To sign up, just follow this link.

I very much like this noble idea. SQL is the universal language for data manipulation and access. And with things like Big Data and Hadoop on the horizon, they are being proactive; it seems they are already working on those too.

Here is a partial list of the data sources they can, or soon will, query with SQL (a sketch of such a query follows the list):

  • Twitter
  • Facebook
  • Hive
  • Eloqua
  • Force.com
  • Salesforce
  • SQL Azure
  • Microsoft Dynamics CRM
  • and many more...
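
To give a feel for the idea, here is a sketch of the kind of query such a connector could make possible. The table and column names are illustrative guesses, not DataDirect's actual exposed schema:

-- Illustrative only: query a cloud data source as if it were a plain table.
-- "account" stands in for whatever relational view the connector exposes.
SELECT id, name, annual_revenue
FROM account
WHERE annual_revenue > 1000000
ORDER BY annual_revenue DESC;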

This is on my hot list of technologies to watch. Well done, DataDirect.

Tuesday, 4 June 2013

Salesforce.com Real Time integration with Oracle using Informatica PowerCenter 9.5

In this post I will describe how you can integrate your Salesforce.com org with a relational database like Oracle in real time, or rather 'near' real time!

I often come across the requirement to quickly propagate changes from cloud platforms like Salesforce.com to on-premise data stores. You can do this with web services, but that is not middleware and it requires coding.

How about doing it with a data integration tool?

Informatica PowerCenter can achieve this by using the CDC (Change Data Capture) feature of the PowerCenter Salesforce connector, when Salesforce is the source in a mapping.

The configuration is simple. All you really have to set up is two properties in the Mapping tab of a Session task in Informatica Workflow Manager.

These are the properties:
  • CDC Time Limit: set to -1
  • Flush Interval: set to 60 seconds (60 seconds is the minimum)
See a screenshot from one of my configurations:

[Screenshot: CDC Time Limit and Flush Interval settings in the session's Mapping tab]
And here is what these two settings mean from the PowerCenter PowerExchange for Salesforce.com User Guide:

CDC Time Limit

Time period (in seconds) that the Integration Service reads changed Salesforce data. When you set the CDC Time Limit to a non-zero value, the Integration Service performs a full initial read of the source data and then captures changes to the Salesforce data for the time period you specify. Set the value to -1 to capture changed data for an infinite period of time. Default is 0. 

Flush Interval

Interval (in seconds) at which the Integration Service captures changed Salesforce data. Default is 300. If you set the CDC Time Limit to a non-zero value, the Integration Service captures changed data from the source every 300 seconds. Otherwise, the Integration Service ignores this value.

That's it, you don't have to configure anything else!

Once you set these properties in the Mapping tab of the session, save, and restart the task in the workflow, the task will run continuously, non-stop. The connector will poll the Salesforce org and propagate any changes you make in Salesforce downstream to the on-premise database system, including INSERT, UPDATE and DELETE operations.
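
Conceptually, what the connector automates resembles polling the objects' system timestamp every flush interval. Here is a rough illustration in SOQL; this is not what PowerCenter literally executes, just the idea behind capturing changes:

-- Every flush interval, ask Salesforce which records changed since the
-- last successful poll (illustrative SOQL, not the connector's actual
-- mechanism; deletes are detected separately).
SELECT Id, Name, SystemModstamp
FROM Account
WHERE SystemModstamp > 2013-06-04T10:00:00Z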

Enjoy!

More reading:

SFDC CDC implementation in Informatica PowerCenter




Friday, 3 May 2013

UTC timestamps for Salesforce from Oracle

I came across the requirement the other day to update Salesforce every 5 minutes with data from Oracle. 

The data in Oracle was a simple table with a few columns A, B, C and a timestamp column T indicating the last modified date/time of the record.

To my surprise, whenever I sent data changes from Oracle, mapping the columns and the timestamp to their corresponding Salesforce fields, the Salesforce field T would record the time 1 hour ahead of wall-clock time!

I quickly realized that Salesforce, no matter where in the world you are when entering data into it, always saves your standard and custom date/time fields in UTC (think of it as the new GMT). The UI takes care of displaying your local time by applying your time-zone offset to the stored UTC timestamp.

The 1 hour difference I was experiencing was down to time zones and DST (Daylight Saving Time). I live in London, United Kingdom, and currently the DST here is BST, British Summer Time, which is GMT + 1 hour. So if you modify data in Salesforce at 14:00 according to wall clocks in London, your custom timestamp field in Salesforce will actually record 13:00 (UTC time)!

So when sending timestamps to Salesforce you have to send them in UTC!  

In the case of my Oracle table A, B, C, T, the date/time column read:

01/05/2013 17:07:20


If you send this timestamp to Salesforce as is, it will be recorded as a UTC value, but then, because the Salesforce UI applies the BST (GMT+1 DST) offset, it will display:

01/05/2013 18:07:20 (1 hour ahead, in the future!)


So the solution was to always calculate the date/time in Oracle as UTC, as this is the value Salesforce expects. To do exactly that, and to provide the date/time field in UTC regardless of DST, I used the Oracle SQL CAST function to obtain the given date/time value at GMT (UTC), like this:

select to_char(cast(my_date as timestamp with local time zone) at time zone 'GMT', 'DD/MM/YYYY HH24:MI:SS') utc_my_date from my_table;


The expression above will always return a GMT (UTC) date/time no matter which time zone or DST setting you are in. You can then safely pass that as a timestamp to Salesforce and be sure the correct time will be saved.
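
To convince yourself, compare the session's wall-clock time with the UTC value the same technique produces; during BST the two should differ by exactly one hour. A quick sanity check, assuming a database session in the Europe/London time zone:

-- Wall-clock time next to its UTC equivalent; during BST the
-- two columns should be one hour apart.
select to_char(current_timestamp, 'DD/MM/YYYY HH24:MI:SS') local_time,
       to_char((current_timestamp at time zone 'GMT'),
               'DD/MM/YYYY HH24:MI:SS') utc_time
from dual;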




Thursday, 14 February 2013

SQL Access to Salesforce data

In this post I will talk about an ODBC/JDBC driver solution I discovered recently which enables you to access your Salesforce data using the standard SQL query language.

The company which provides these drivers is called Progress|DataDirect.

Their JDBC/ODBC Salesforce Connect XE drivers act as translators between SQL and SOQL (Salesforce's proprietary query language). So you can write joins and use expressions like SUBSTRING on your Salesforce data as if the data were in a relational database.

I found the concept quite interesting. If you already know SQL, and there are many people who do, you can query data in Salesforce standard objects like Account and Contact, as well as custom objects, with a SQL client tool.

For this post I used the SQuirreL SQL Client and the JDBC driver. You can easily point your own SQL client tool at these drivers too; ODBC and JDBC are data access standards many tools comply with.

To get started

1. Download the Progress|DataDirect Salesforce.com JDBC driver. The file will be called something like PROGRESS_DATADIRECT_CONNECT_JDBC_5.1.0.jar (I downloaded the 15-day trial version).


2. Install the driver as per the instructions found here; more general information, like the User Guide, is here.

3. Configure the SQuirreL SQL Client to use the driver (to install SQuirreL go here).

Once you have downloaded and installed the driver, start the SQuirreL client tool and register the Salesforce.com JDBC driver with SQuirreL like this:

Go to the Drivers tab and click the + sign.

[Screenshot: the SQuirreL Drivers tab]
Add a driver using the plus sign (+) and fill in the driver details as below. You can find the Class Name, Website URL and Extra Class Path for the Progress|DataDirect Salesforce.com JDBC drivers here, and more about connection properties in the DataDirect Connect Series for JDBC User's Guide.



Once you have configured the driver, it is time to add an Alias, a SQuirreL connection, to your Salesforce org. I used my developer org below. Just follow the wizard to add the alias; it will ask you for the driver to use. Choose the Salesforce driver you created above.

[Screenshot: the SQuirreL Alias wizard using the Salesforce JDBC driver]
Once you have created the alias (connection), it is time to test it and connect to Salesforce with the new JDBC driver. Provide your Salesforce credentials like this:

[Screenshot: the SQuirreL connection dialog with Salesforce credentials]
Once you connect, you can issue any SQL statement against your Salesforce org. Below I am writing a SQL join between the Account and Contact standard objects.

[Screenshot: a SQL join between Account and Contact running in SQuirreL]
What the driver really does is translate your SQL to SOQL.
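
For reference, the join I ran looked roughly like the sketch below. Column names follow the Salesforce API names; the exact tables and columns your driver version exposes may differ slightly:

-- Join the Account and Contact standard objects as if they were
-- relational tables; the driver translates this into SOQL calls.
SELECT a.Name, c.FirstName, c.LastName
FROM Account a
JOIN Contact c ON c.AccountId = a.Id
WHERE a.Name LIKE 'United%';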

To read more about the JDBC and other drivers, go to the company site, Datadirect.com.

There is potential here: imagine all the data profiling, data quality operations and data integration you could do 'in place' in SaaS and cloud systems with SQL, without having to move the data around.

 
More resources

More blogs and resources on SQL access to Salesforce can be found below. There is also a very interesting blog post which shows how you can access Salesforce data directly from within Oracle, here.




Thursday, 7 February 2013

Send tweets from Oracle with OAuth

Twitter is a great real-time social platform. Timelines and hashtags are a great way to communicate relevant information to an audience of subscribers.

There are lots of websites and applications which use Twitter to provide content to their users. Programming languages like Java, PHP and Python have Twitter API libraries for quickly sending and receiving content, but I hadn't come across a fully fledged library in PL/SQL which would enable you to send a tweet using Twitter's latest OAuth security protocol.

There are blogs out there which show you how to send a tweet from Oracle, but not many use the new OAuth security protocol.

The only blog post I have seen so far which uses PL/SQL and OAuth to send a tweet from Oracle is 'Some coding hero's heroic mumblings', by an Oracle consultant.

I have spent time reviewing his code, written in 2010, together with the comments made on his post, and managed to trim it down to a single one of his PL/SQL procedures. The block below sends the 140-character tweet for you using OAuth. To be able to do this, as Some Coding Hero says, you will have to create a Twitter Application for your Twitter handle. Actually, you don't really create an application; you just sign up and obtain four security codes! Once you register your application with Twitter, you are given four OAuth security codes:

[Screenshot: the four OAuth codes - Consumer key, Consumer secret, Access token, Access token secret]
With the above four OAuth security codes you can use Some Coding Hero's last block of code on its own to send a tweet from Oracle (I used Oracle APEX) like this:

Update 21-JUN-2013: the Twitter API 1.1 endpoint and the use of Oracle Wallet have been added to the script below.

DECLARE  
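
-- NOTE: this block assumes a URLENCODE function already exists in the
-- schema (Some Coding Hero's original post provides one; UTL_URL.ESCAPE
-- can serve as its basis). It also needs execute rights on DBMS_CRYPTO
-- and UTL_HTTP, plus a network ACL allowing access to api.twitter.com.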
  
-- Twitter API 1.1 update
  l_oauth_request_token_url CONSTANT VARCHAR2 (500) := 'https://api.twitter.com/1.1/statuses/update.json';   
--
  l_oauth_consumer_key CONSTANT VARCHAR2 (500) := 'xxxxxxxxx';  
  l_oauth_token  CONSTANT VARCHAR2 (500) := 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxx';  
  l_oauth_secret CONSTANT VARCHAR2 (500) := 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx';  
  l_oauth_nonce VARCHAR2 (500);  
  l_oauth_signature_method CONSTANT VARCHAR2 (10) := urlencode ('HMAC-SHA1');  
  l_oauth_timestamp VARCHAR2 (100);  
  l_oauth_version CONSTANT VARCHAR2 (5) := urlencode ('1.0');  
  l_oauth_consumer_secret CONSTANT VARCHAR2 (500) := 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx';  
  
  l_http_method VARCHAR2 (5) := 'POST';  
  l_oauth_base_string VARCHAR2 (2000);  
  
  l_oauth_key VARCHAR2 (500) := l_oauth_consumer_secret || '&' || l_oauth_secret ;  
    
  l_sig_mac RAW (2000);  
  l_base64_sig_mac VARCHAR2 (100);  
    
  http_req UTL_HTTP.req;  
  http_resp UTL_HTTP.resp;  
    
  l_update_send VARCHAR2(2000);  
  l_oauth_header  VARCHAR2(2000);  
    
  l_line  VARCHAR2(1024);  
    
  resp_name  VARCHAR2(256);  
  resp_value VARCHAR2(1024);  
   
-- put the tweet in the urlencode function below 
  l_content varchar2(140) := urlencode('@somecodinghero thank you');  
  
  l_random varchar2(25);  
  
BEGIN  

 -- Oracle Wallet
    utl_http.set_wallet('file:/home/app/oracle/product/11.2.0/dbhome_1/owm/wallets/oracle', 'putyourwalletpasswordhere');


 -- Get the timestamp  
  SELECT urlencode ((SYSDATE - TO_DATE ('01-01-1970', 'DD-MM-YYYY'))  * (86400)) 
  INTO l_oauth_timestamp  
  FROM DUAL;  
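  -- NOTE: the OAuth timestamp must be Unix epoch seconds in UTC. SYSDATE
  -- returns the database server's local time, so the arithmetic above is
  -- only correct when the server clock runs on UTC; otherwise consider
  -- basing it on SYS_EXTRACT_UTC(SYSTIMESTAMP).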
  
  -- RANDOM oauth_nonce  
  SELECT dbms_random.string('A',25)  
  INTO l_random  
  FROM DUAL;  
    
  SELECT urlencode (UTL_ENCODE.base64_encode(UTL_I18N.string_to_raw (l_random, 'AL32UTF8')))  
  INTO l_oauth_nonce  
  FROM DUAL;  
  
  
   
  l_oauth_base_string := l_http_method   
                          || '&'  
                          || urlencode (l_oauth_request_token_url)  
                          || '&'  
                          || urlencode ( 'oauth_consumer_key'  
                              || '='  
                              || l_oauth_consumer_key  
                              || '&'  
                              || 'oauth_nonce'  
                              || '='  
                              || l_oauth_nonce  
                              || '&'  
                              || 'oauth_signature_method'  
                              || '='  
                              || l_oauth_signature_method  
                              || '&'  
                              || 'oauth_timestamp'  
                              || '='  
                              || l_oauth_timestamp  
                              || '&'  
                              || 'oauth_token'  
                              || '='  
                              || l_oauth_token  
                              || '&'  
                              || 'oauth_version'  
                              || '='  
                              || l_oauth_version  
                              || '&'  
                              || 'status'  
                              || '='  
                              || l_content);  
                                
  DBMS_OUTPUT.put_line (l_oauth_base_string);  
    
  l_sig_mac := DBMS_CRYPTO.mac (  UTL_I18N.string_to_raw (l_oauth_base_string, 'AL32UTF8')  
                                , DBMS_CRYPTO.hmac_sh1  
                                , UTL_I18N.string_to_raw (l_oauth_key, 'AL32UTF8'));  
                                  
  DBMS_OUTPUT.put_line ('Combined sig: ' || l_oauth_key);  
                                  
  l_base64_sig_mac := UTL_RAW.cast_to_varchar2 (UTL_ENCODE.base64_encode (l_sig_mac));  
    
  DBMS_OUTPUT.put_line ('MAC Signature (Base64-encoded): ' ||  l_base64_sig_mac);  
    
  l_update_send := l_oauth_request_token_url || '?status=' || l_content;  
                     
    http_req := UTL_HTTP.begin_request (  l_update_send  
                                        , l_http_method  
                                        , UTL_HTTP.http_version_1_1);  
                                          
   DBMS_OUTPUT.put_line ('UPDATE URL ' || l_update_send);  
     
   UTL_HTTP.set_response_error_check (TRUE);  
   UTL_HTTP.set_detailed_excp_support (TRUE);  
     
   
    
    l_oauth_header := 'OAuth oauth_nonce="' || l_oauth_nonce || '", '  
                      || 'oauth_signature_method="'|| l_oauth_signature_method || '", '  
                      || 'oauth_timestamp="'|| l_oauth_timestamp || '", '  
                      || 'oauth_consumer_key="'|| l_oauth_consumer_key || '", '  
                      || 'oauth_token="' || l_oauth_token || '", '  
                      || 'oauth_signature="' || urlencode (l_base64_sig_mac) || '", '  
                      || 'oauth_version="' || l_oauth_version || '"';  
                        
    utl_http.set_header ( r => http_req,   
                          NAME => 'Authorization', VALUE => l_oauth_header);  
                            
    DBMS_OUTPUT.put_line  ('HEADER: ' || l_oauth_header);                          
                            
    utl_http.write_text(  r => http_req, DATA => l_content);   
       
    http_resp := utl_http.get_response(r => http_req);  
       
   DBMS_OUTPUT.put_line('GETTING RESPONSE HEADERS! ');  
     
   FOR i IN 1..utl_http.get_header_count(http_resp) LOOP  
    utl_http.get_header(http_resp, i, resp_name, resp_value);  
    dbms_output.put_line(resp_name || ': ' || resp_value);  
   END LOOP;  
     
  DBMS_OUTPUT.put_line('Getting content:');  
  BEGIN  
      LOOP  
        utl_http.read_line(http_resp, resp_value, TRUE);  
        dbms_output.put_line(resp_value);  
      END LOOP;  
        
      EXCEPTION  
      WHEN utl_http.end_of_body THEN  
        DBMS_OUTPUT.put_line('No more content.');  
  END;  
  
   utl_http.end_response(r => http_resp);  
   
  
  EXCEPTION  
    when others then  
      DBMS_OUTPUT.put_line('HTTP ERROR: ' || utl_http.get_detailed_sqlerrm);  
  
END;  

Thank you, Some Coding Hero!

Saturday, 26 January 2013

Is DELETE necessary?

A delete is a very destructive operation for a database, one which requires a lot of thinking and preparation and which, most of the time, is irrevocable. You can get in a lot of trouble if you delete the wrong set of rows and lose very important data!

Maybe the DELETE command should be banned from SQL. It was probably invented when disks were tiny, and it might not be suitable for our times, especially now that we talk about things like BIG DATA.

Why do we have to delete anyway? We should invalidate and age out data, never destroy it. I am one of those who would say 'keep it' and never lose it. Even a typo, an error while entering somebody's name in a text field, tells us a story: the number of attempts to get it right, the number of wrong characters typed, the time the wrong entry took place, and much more. That is data. Temporal databases come to mind. Why delete it?
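
In practice, 'invalidate and age out' can be as simple as a soft-delete column. A minimal sketch on a hypothetical table:

-- Hypothetical table: mark rows as no longer current instead of
-- destroying them.
ALTER TABLE customers ADD (valid_to DATE);

-- A "delete" becomes an update stamping when the row stopped being valid.
UPDATE customers
SET    valid_to = SYSDATE
WHERE  customer_id = 42;

-- Live data is simply the set of rows that have not been aged out.
SELECT * FROM customers WHERE valid_to IS NULL;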

But one can argue that not deleting can be dangerous too! One can claim that by refusing to delete we might end up with a lot of old, duplicated, bad-quality, uncleaned, irrelevant and untrustworthy data.

Maybe the worst thing that can happen to data is for us not to delete it enough. Perhaps it is because of this lack of deletes, and this fear of deletes, that we end up with all the 'bad data' found in some database systems today.

Whatever you choose to do, delete or not delete, one thing is for sure: you should know what you are deleting or not deleting. Confidence in choosing what to delete comes from understanding your data model and the implications a delete will have on it, and never from the mere fact that the data is not required for your project!




Sunday, 6 January 2013

Salesforce basic data model

All applications store data. Data is stored in database tables, or objects in the case of Salesforce. In this post I will try to describe in simple terms how to read an Entity Relationship model, sometimes called a Data Model. I give references for further reading at the end of the post.

Data is related to other data through relations and associations. For example, in Salesforce's data model, one Account can have many Contacts. In the model these relationships/associations are indicated by drawing lines from one object to the others, as seen in the diagram below: the 'one' side ends in a single line, whereas the 'many' side ends in multiple lines, like a crow's foot.

Below is how this is shown in Salesforce's Schema Builder application:

[Screenshot: Schema Builder showing the Account-to-Contact relationship]
This is a standard way of showing relations between entities in databases and thus objects in Salesforce.

Walking through some of the Standard Objects and their relations to other objects in Salesforce

There are a couple of dozen standard objects in Salesforce. Below I have drawn, with Schema Builder, a few core standard objects of the core CRM implementation:

[Screenshot: Schema Builder diagram of core CRM standard objects, including Lead, Account, Contact, Opportunity, Quote, QuoteLineItem and Product]

How to read the Entity Relationship (ER) diagrams

Let's start by looking at the above Entity Relationship diagram from Salesforce.

The Lead object is a kind of standalone object: it is not related to any other object in one way or another. There are no lines! Usually Leads are not associated with any Accounts or Contacts until they are converted into one of them.

The Account standard object is related to the Contact standard object; follow the line from Account to Contact in the picture. The Account object can be related to other objects too; you can even relate it to custom objects (your own objects, created and added to the model). The model can be extended depending on what application you are trying to build. To read and understand the model, you just follow the lines starting from the Account object to the other objects and look at the ends of the lines. If you see a crow's foot of three lines at the end, that is the 'many' side of the relationship; if you see a single solid line, it is the 'one' side. As you can see in the model, Account relates to the Opportunity and Quote objects too.

All you have to do is look at one object and follow the lines coming out of it. A tip: it is best to start looking at the relationship between objects from the 'one' side rather than the 'many' side. Try it on your own; check how the Product object is related to the QuoteLineItem object.

The implementation of relationships between the objects is done with special fields (columns) called keys. In databases these are called Primary Keys and Foreign Keys. The posting/copying of these keys from one object to the other is what creates the relationships and establishes the associations.
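
In relational terms, the one-to-many Account-to-Contact relationship would be implemented roughly like this (hypothetical tables for illustration, not Salesforce's actual storage):

-- The 'one' side: each account row has a primary key.
CREATE TABLE account (
  id   NUMBER PRIMARY KEY,
  name VARCHAR2(255)
);

-- The 'many' side: each contact row carries a foreign key pointing at
-- the account it belongs to.
CREATE TABLE contact (
  id         NUMBER PRIMARY KEY,
  account_id NUMBER REFERENCES account (id),
  last_name  VARCHAR2(80)
);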

Data modelling and Entity Relationship diagrams are processes used in software engineering to analyse business requirements and build a 'database' information system. Salesforce is such an information system.

For further reading on Entity Relationship diagramming, see the links below.
A search for 'Entity Relationship Diagram' on Google will return a plethora of other information.

Salesforce has done a great job with Schema Builder. To obtain a free Salesforce developer account, have a look at Schema Builder, start creating objects and see relationships, go to: http://developer.force.com/