Wednesday, September 19, 2012

Order Management Module - Different ways of invoking functionality

With WebSphere Commerce Feature Pack 4 onward, we now have three ways of invoking the Order Management URLs. Let's look at each of them, and at when and how to use them with performance in mind.

1. Struts actions invoking a WebSphere Commerce command.
Invoke the Struts action configured against a WebSphere Commerce command over HTTP/HTTPS, passing the request parameters using a GET or a POST method.

Ex: OrderItemAdd action invoking the OrderItemAdd command.

        <action
            parameter="com.ibm.commerce.orderitems.commands.OrderItemAddCmd"
            path="/OrderItemAdd" type="com.ibm.commerce.struts.BaseAction">
            <set-property property="https" value="0:1"/>
            <set-property property="authenticate" value="0:0"/>
        </action>

       Request:  https://localhost/....../OrderItemAdd?catEntryId=123&<otherparameters>

Here the request parameters are passed from a browser, or any other client, as name/value pairs and are processed directly by the OrderItemAdd command. Of course, the Struts validator framework can validate the parameters being passed.
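As a quick illustration, this is how such an action might be called from the command line. The host, store path, and parameter values below are hypothetical; the command is printed rather than executed, since it needs a running server.

```shell
# Hypothetical host and servlet path for a local dev box; parameter values are examples.
BASE="https://localhost/webapp/wcs/stores/servlet"
URL="$BASE/OrderItemAdd?catEntryId=123&quantity=1"

# -k skips certificate validation on a self-signed local instance.
echo "curl -k -s '$URL'"
```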

2. SOI Actions invoking the SOI client and the Commerce Command

Invoke the SOI action configured in Struts, which in turn invokes the SOI client and then passes the request on to the Commerce command.

Ex: AjaxOrderChangeServiceItemAdd invoking the Component Service

        <action
            parameter="order.addOrderItem"
            path="/AjaxOrderChangeServiceItemAdd"
            type="com.ibm.commerce.struts.AjaxComponentServiceAction">
            <set-property property="authenticate" value="0:0"/>
            <set-property property="https" value="0:1"/>
        </action>

If this action is invoked from a browser with name/value pair parameters, it first converts them into an OAGIS message, then converts that message back into name/value pairs and invokes the OrderItemAddCmd. So there is an additional round trip from name/value pairs to an OAGIS message and back, because OrderItemAddCmd only understands the name/value pair format.

If invoked from browsers, this extra conversion can have a negative impact on the performance of the Order Management functionality.

This path is appropriate when the client needs to pass the OAGIS message format into the WebSphere Commerce server.
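To give a feel for what the intermediate format looks like, here is a heavily simplified sketch of an OAGIS-style add-item message. Element names are abbreviated and namespaces are omitted; the real message is a full WebSphere Commerce OAGIS BOD, so treat this only as an illustration of the conversion overhead.

```xml
<ProcessOrder>
  <ApplicationArea>
    <CreationDateTime>2012-09-19T10:00:00Z</CreationDateTime>
  </ApplicationArea>
  <DataArea>
    <Process/>
    <Order>
      <OrderItem>
        <CatalogEntryIdentifier>
          <UniqueID>123</UniqueID>
        </CatalogEntryIdentifier>
        <Quantity>1</Quantity>
      </OrderItem>
    </Order>
  </DataArea>
</ProcessOrder>
```

Every browser request on this path pays for building a structure like this and then flattening it again.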

3. REST services introduced in FEP4.

Invoke the REST service, passing the request body as JSON.

Ex: POST /wcs/resources/store/10101/cart HTTP/1.1
Host: localhost
Content-Type: application/json

{
   "orderItem": [
      {
         "productId": "10541",
         "quantity": "2.0",
         "itemAttributes": [
          {
           "attrName": "10297",
           "attrValue": "4T"
          }
         ] 
      },
      {
         "productId": "10823",
         "quantity": "3.0"
      },
      {
         "productId": "10260",
         "quantity": "1.0"
      }
   ]      
}

Here the request is first interpreted by the REST web service, which converts the JSON string into a map of name/value pairs and then invokes the Order Management SOI service (section 2 above); after further processing by SOI, the response is converted back into JSON.

With REST, we now have additional conversions: JSON to name/value pairs, then name/value pairs to an OAGIS message, then OAGIS back to name/value pairs (and the response converted back to JSON at the end).

JSON is a widely used format for AJAX requests and for exposing RESTful services, but do we really need this many levels of conversion before the request is actually processed?
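The cart request shown above can be sketched as a command-line call. The host is hypothetical (the store ID 10101 follows the example); the command is printed rather than executed, since it needs a live server.

```shell
# Abbreviated version of the JSON body from the example above.
PAYLOAD='{"orderItem":[{"productId":"10541","quantity":"2.0"}]}'

# -k is for a self-signed local certificate; host and credentials are hypothetical.
CMD="curl -k -s -X POST https://localhost/wcs/resources/store/10101/cart"
CMD="$CMD -H 'Content-Type: application/json' -d '$PAYLOAD'"
echo "$CMD"
```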

Friday, September 14, 2012

Importing Database Dump into Oracle - For local Development Box

One of the most important prerequisites for healthy and accurate development and unit testing on local machines is data that is as close as possible to the data on the TEST or Stage environments. Importing that data into local database installations should be a regular activity, preferably once per iteration or release.

This section explains the process of importing Oracle data dumps into the local environment.
Assumptions: the data has been exported into .dmp format with all the tables in it. Of course, PII data should not be dumped as-is; it should be altered so that all passwords are the same, and email IDs and address information are changed to the same dummy values. Alternatively, dump only non-PII data, or use the no-rows option when dumping PII tables. This article covers two approaches: importing a full (clean) dump and importing a partial dump.

First let's look at the partial dump. This is the easiest of all: a single command fulfills the need.

Step 1: Identify the directory where you want the dump file to reside.
                      select * from ALL_DIRECTORIES;
This lists all the directory objects mapped in Oracle. Identify the one used by your schema, and copy the .dmp file to its directory_path location on your system.

Step 2: Run the impdp utility.

                        impdp system/<password> directory=<directory name from step1 query> dumpfile=<dump file name> logfile=<log file name> PARALLEL=8 CONTENT=ALL TABLE_EXISTS_ACTION=REPLACE

Here TABLE_EXISTS_ACTION=REPLACE replaces your local tables with the ones from the dump.

This will take some time depending on the size of the data being imported.
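Putting the pieces together, the partial import might look like this with the placeholders filled in. All the names below are hypothetical examples; the command is assembled and printed so you can review it before running it.

```shell
# Hypothetical names -- substitute your own directory object, dump file and log file.
DIRNAME=DATA_PUMP_DIR            # from: select * from ALL_DIRECTORIES;
DUMPFILE=stage_partial.dmp
LOGFILE=stage_partial_imp.log

# Note the full parameter name is TABLE_EXISTS_ACTION.
CMD="impdp system/<password> directory=$DIRNAME dumpfile=$DUMPFILE logfile=$LOGFILE PARALLEL=8 CONTENT=ALL TABLE_EXISTS_ACTION=REPLACE"
echo "$CMD"
```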

Now, to do a clean import or to create a new schema, the following steps are required.

Step 1: Collect the required information

  • Directory to hold the dump file. Your dump file should be in the path returned by the query.
              select * from ALL_DIRECTORIES;  -- get the directory name and path
  • Get the tablespace name used by your schema. The following queries will help you identify it (in Oracle, the schema is the same as the user name):
              select tablespace_name from dba_tablespaces;  -- lists all tablespaces in the database
              select tablespace_name from all_tables where table_name = '<your table name>';  -- the tablespace used by your specific table

  • Get the datafile locations on the filesystem (you won't be able to reuse the same tablespace file names, but you can reuse the same path; once the import has succeeded you can delete the old files).
                  select tablespace_name,file_name from dba_data_files;

From the above queries, get the tablespace names for your data and indexes; if a proper naming convention has been used, they will be something like <schemanameDATA> and <schemanameINDX>.

Step 2: Drop tablespaces
            drop tablespace <schemanameDATA> including contents and datafiles;
            drop tablespace <schemanameINDX> including contents and datafiles;

Step 3: Create the Tablespaces (Datafile location based on step 1, third query)


          create tablespace <tablespace name for data> datafile '<DATAFILE NAME INCLUDING LOCATION>' size 2g reuse autoextend on;
          create tablespace <tablespace name for index> datafile '<DATAFILE NAME INCLUDING LOCATION>' size 2g reuse autoextend on;


Set the size as you require; 2g is 2 gigabytes.

Step 4: Drop the user
           drop user <schemaname/username> cascade;

Step 5: Create the user and required permissions
           create user <schemaname> identified by <password> default tablespace <tablespace name for data> temporary tablespace temp quota unlimited on <tablespace name for data> quota unlimited on <tablespace name for index>;
           grant connect,resource,create materialized view to <schema name>;

           grant create view to <schema name>;

           grant create synonym to <schema name>;

Step 6: Now log out of sqlplus. At the command prompt, run the following command:

          impdp <oracle_admin_user>/<password> full=Y directory=<directory name> dumpfile=<dumpfilename> logfile=<logfilename> REMAP_SCHEMA=<schema name of the export>:<schema name of import> CONTENT=ALL
             
          Directory Name - is the name identified in Step 1, first query

          REMAP_SCHEMA - is required if the original schema of the export is not the same as the schema into which you are importing.

If you only need to create a new schema, follow Steps 1, 3, 5, and 6.
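Steps 2 through 5 can be collected into a single SQL script and reviewed before running. All the names below (schema, password, datafile path) are hypothetical examples; substitute your own values from the Step 1 queries, then run the script as a DBA user and follow it with the impdp command from Step 6.

```shell
# Hypothetical example values -- substitute your own schema name,
# password and datafile path (from the dba_data_files query in Step 1).
SCHEMA=wcsdev
DATAFILE_DIR=/u01/oradata/orcl

# Generate the drop/create script for the tablespaces and user.
cat > recreate_schema.sql <<EOF
drop tablespace ${SCHEMA}DATA including contents and datafiles;
drop tablespace ${SCHEMA}INDX including contents and datafiles;
create tablespace ${SCHEMA}DATA datafile '$DATAFILE_DIR/${SCHEMA}data01.dbf' size 2g reuse autoextend on;
create tablespace ${SCHEMA}INDX datafile '$DATAFILE_DIR/${SCHEMA}indx01.dbf' size 2g reuse autoextend on;
drop user $SCHEMA cascade;
create user $SCHEMA identified by changeme default tablespace ${SCHEMA}DATA temporary tablespace temp quota unlimited on ${SCHEMA}DATA quota unlimited on ${SCHEMA}INDX;
grant connect,resource,create materialized view to $SCHEMA;
grant create view to $SCHEMA;
grant create synonym to $SCHEMA;
EOF

cat recreate_schema.sql
```

Run it with something like sqlplus system/<password> @recreate_schema.sql before invoking impdp.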

Note: Please ensure that your extracted dump file contains the entire schema; if any tables were excluded when the dump was created, your database will be missing those tables.