Channel: SAP Business Process Management

Complete a task: Build OutputData


Hi there,

 

 

I ran into the problem that I was not able to complete a BPM task, because I could not figure out how to fill the task's OutputData structure of the BPM OData service correctly.

After solving the issue, I'm writing my first blog post to share the solution with you.

 

 

 

Let's imagine you have a task with the same complex input/output type called "HandleRequestType":

 

<complexType name="HandleRequestType">
    <sequence>
        <element name="Approver" type="tns:ApproverType" maxOccurs="1" minOccurs="1"></element>
        <element name="Request" type="tns:RequestType" maxOccurs="1" minOccurs="1"></element>
        <element name="Requester" type="tns:RequesterType" maxOccurs="1" minOccurs="1"></element>
        <element name="Status" type="tns:StatusType" maxOccurs="1" minOccurs="0"></element>
    </sequence>
</complexType>

 

 

As you can see, it consists of several complex subtypes such as "RequestType" or "RequesterType".

 

Let's imagine that you only want to fill two fields of this data structure:

  1. Field "Action" in element "Status"
  2. Field "ReferenceNumber" in element "Request"

 

The code is very simple, but it took me several hours to find out how it works.

Please have a look at the completeTask() method, which builds OutputData and sends a POST request to the OData service.

 

controller.js

completeTask : function() {
    // Get TaskID and data model
    var taskId = getValueOfURLParameter("taskId");
    var odataModel = this.getView().getModel();

    // Create OutputData
    var outputData = {};

    // Create all needed subtypes
    var handleRequestType = {};
    var status = {};
    var request = {};

    // Fill values for fields that need to be sent
    status.Action = "Approved";
    request.ReferenceNumber = "1234";

    // Build OutputData
    handleRequestType.Status = status;
    handleRequestType.Request = request;
    outputData.HandleRequestType = handleRequestType;

    // Complete task with built OutputData
    odataModel.create("/OutputData", outputData, null,
        function sendData_OnSuccess(oData, response) {
            alert("Task has been completed successfully");
        },
        function sendData_OnError(oError) {
            alert("Task could not be completed");
        });
}
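The snippet above references a getValueOfURLParameter() helper that is not shown in the blog. A minimal sketch of such a helper might look like this (the optional second parameter is my own addition so the function can also be exercised outside a browser):

```javascript
// Sketch of the getValueOfURLParameter() helper referenced in completeTask().
// "search" defaults to the browser's query string; it can be passed in
// explicitly for testing outside the browser.
function getValueOfURLParameter(name, search) {
  search = search || window.location.search;
  // Drop the leading "?" if present
  var query = search.charAt(0) === "?" ? search.substring(1) : search;
  var pairs = query.split("&");
  for (var i = 0; i < pairs.length; i++) {
    var pair = pairs[i].split("=");
    if (decodeURIComponent(pair[0]) === name) {
      return decodeURIComponent(pair[1] || "");
    }
  }
  return null;
}
```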

 

 

Keep in mind that you can build up the OutputData structure without any InputData binding defined in your view!

It took me some time to figure that out.

 

 

Best regards,

Thorsten.

 

P.S.: Please note that you have to unzip the attachment "WebContent.zip.txt.zip" and remove the .txt extension. Then you can extract the "WebContent" folder. It's a little weird, but that's what happens when you upload a "*.zip.txt" file...


Hot off the presses, Practical Workflow for SAP


Those of you in the Workflow space may have seen this coming - after all, the previous edition of the 'Workflow Bible' was printed in 2009, and the first edition - the original, was printed in 2004.  And my, how things do change in the SAP world!


One thing that doesn't change is the need for businesses to optimize their processes, and this can be accomplished by using (wait for it...) SAP Business Workflow. 

 

The latest edition of the SAP PRESS book will be available for purchase on August 5, with electronic versions also available.  As a co-author, I was pretty excited to receive a copy (OK, two copies!) today.

 

[Image: Workflow Book-1.jpg]

 

But this blog isn't really about selling the book.  Those of you who are trying to learn workflow will make up your own minds about that. 

This blog is to thank the people who made it possible - and to tell you something you may not know.

 

First off, Alan Rickayzen and Jocelyn Dart - you slaved over the first version of this book, and it was amazing.  It is incredibly well written and even portable. 

 

Secondly, Ginger Gatling, when it was time to create the second version, you rallied a team from all over the world, and you were an inspiration to future cat-herders everywhere. Not only that, but you (YOU) are the person who engineered the donation of all author proceeds to Doctors Without Borders, which has set the model for many others, including (but not limited to) Enterprise Information Management.

 

Third, this book, at a mere ~1000 pages, not only represents some of the best information on the newest SAP technologies (yes, we mention HANA, and we talk OpenUI5), it also represents the blood, sweat and tears of all of its authors. Head on over to the SAP PRESS site to see who they are!  They are all passionate people and I am grateful to hold a bit of their knowledge in my hands.

 

Fourth, those of you who are Facebook-enabled, head on over to the Workflow Book page on Facebook. Alan Rickayzen has set up a little challenge to see how many likes the page can get (no agents will call).

 

I'd also like to thank Kelly Weaver and Laura Korslund for their outstanding work in getting this thing done.

 

Oh, and one more thing?  Even if you don't need to know about SAP Business Workflow, you can still join in the SAP community by being part of the 'I Care, I Gave, I Inspire' missions.  Doctors Without Borders is being kept too busy these days. 

UWL/XML - A beginner's perspective, part 1


After years, quite literally, of wishing we had Universal Worklist (UWL) at last my dream came true.  We are implementing SRM7 and the Universal Worklist (UWL).   I've been to many presentations - some given by Jocelyn Dart, Thomas Kosog, and Ginger Gatling.  I've read 'the book'.  I even found Darren Hague's book on UWL.

 

Here are some of my top 'likes' in the UWL/XML area:

 

 

Top 10 Reasons to Use Universal Worklist: Getting the Most Out of Universal Worklist

 

Introduction to Universal Worklist

 

Workitems, Universal Worklist and Web Dynpro for ABAP

 

 

Resources - Business Process Expert - SCN Wiki

 

I had all the goodies I might possibly need!  I was READY.

Or so I thought.

 

The initial steps you need to take to set up the UWL are well documented and fairly fixed.  Your Basis team will do the initial configuration - setting up the connections and all that.  What happens next is that you, presumably the 'workflow developer' or maybe the 'ABAP programmer', are thrust into a world of text files, DTD files, little-to-no change management and a new language.  Those of you who are more webby, user-interface-y savvy should probably leave now.  But for those of you who are new to Universal Worklist and XML, feel free to stick around.


I know some might say that XML is merely the markup language, and it's not even a real programming language.  I'll say.

 

What you are confronted with is a baffling number of huge files with no easily discernible rhyme or reason. There's no file management built in.  There isn't (really) a syntax check (although you'll learn more about that later on).  There are no RULES - except that the test cycles between 'Upload new configuration' and 'run' involve many steps.

 

The fact that we are implementing SRM7 (upgrading from SRM5) with our 'old' application-controlled workflows added another level of uncertainty.  So I hope that if you read this blog (series?  We'll see) it will help you as much as some of the community people have helped me.

 

Now you are doing a lot of your work in a web interface.  This means caches and cookies and browsers, oh my!  It will help if you are allowed to run multiple browsers on your work systems.  For some of us, certain <ahem> internet explorers are not welcome, so we are using Firefox and Chrome.  I chose to do most of my 'work' in FF, but then was dismayed to see that some of the UWL/SRM rendering is not correct in Chrome.  So simultaneously uploading changes to UWL in FF and then logging in a user in Chrome is not the way to go.  You can reverse that, or use that other browser too.

 

You will see, once you click on 'Click to Manage Item Types and View Definitions' a long list of configurations. 
Each of these configurations represents a portion of XML delivered by SAP for specific functionality.

 

 

And what's all that gunk below?  That's going to be your world for a while, my friend.  So get yourself off to a good start and familiarize yourself with various XML editors as well.  I am using Notepad++ and at first it made my eyes bleed, but now I am getting used to it, and I've found some good search and edit capabilities.

 

 

Now, of all these files (something like 20 of them) the ones you will want to pay attention to first are:

 

uwl.webflow     - contains the definitions of some basic workflow actions, such as 'forward' and 'upload attachments'

uwl.standard     - also contains the definitions of various actions

com.sap.pct.srm.core - contains the XML for each workflow task and the actions that are applicable for each task


Did I just say 'task'?  Got it!  Go edit that com.sap.pct.srm.core file then!  That's all I need to do!

 

So download the com.sap.pct.srm.core file.  Save the original downloaded files in a directory where you will not be tempted to overwrite them.  While you're at it, create some folders - UWL-Sandbox, UWL-Dev, UWL-QA for example.    Then open that downloaded file with the XML editor of your choice.


I have a few screen shots for you to peruse...

 

Notepad ++

 

 

Notepad:

 


Wordpad:

 

You can see that Wordpad offers the more 'Word-like' interface, while the Notepad interface is pretty black and white.  Or just black and white.  And in Notepad ++ as I said initially all the colors were baffling, but now I've adjusted.

 

At any rate, you will choose your XML editing tool and set up folders to keep yourself organized.  First, before you make ANY CHANGES (and we're looking at com.sap.pct.srm.core now), save a copy in your 'working' folder.  Depending on the tool you are using, familiarize yourself with 'Save as' and choose XML.  Or, in Notepad++, choose file type .TXT and add .XML onto the file name.  Why?  I don't know!  But it works.

 

If you're the workflow developer, you'll be a little ahead of the game because you will already know about tasks.  If you're not the workflow person, become friends with them.  Bring them cookies.

 

Look at the task you expect to deliver.  If it's an SAP standard task, say, TS10008126 (SC approval) then great!  The com.sap.pct.srm.core file will contain a definition of that task and how it should be rendered.

 

Here is the delivered XML for that task:

[Image: SAP delivered XML.jpg]

Since I had already replaced our old SC approval task with this task, TS10008126, in the workflow template, I launched a few SC approval workflows.  Then I checked out their representation in the UWL.  But this is what was shown.  And you had to know enough to <gasp> right-click on the item.

 

 

This was not going to fly with my users.  In SRM5, we had buttons.  Yes, people can be trained and change is hard and all that.  But I asked some friends who were also on SRM7, and they all had the buttons!  Was this a complication because we were using Application-Controlled Workflow?  Was I using the 'wrong' task? Would swapping to a different task (and creating more shopping carts and so on and so on) make the buttons show up?

 

No.

 

The answer, as pointed out by George Credland (who was graciously volunteered to help me through our mutual contact, James Ibbotson), was to be found in OSS Note 1803438 "Showing UWL item action as a button - only in context menu".  Wait, WHAT?  I had been combing through OSS notes since January (when I got my sandbox environment).  I know there were no notes related to buttons.  But timing is everything.  This note was published in May, so it simply didn't exist in January, and it took George's fresh eyes to help me.



<Property name="showButtonInPreviewArea" value="yes"/>

 

 

This is where I enhanced the delivered XML to get the Approve Button.   I saved this file (carefully, never overwriting the original) with a file name that identified it as my own, and uploaded it to UWL, re-registered, and cleared the cache (I'll save all this for another blog).



 

You see, the task refers to one or more actions.  The actions are defined (at least once, but perhaps multiple times) in other XML files.  The trick is to match your task and the actions it performs with the action definitions. When we decided we needed the 'Forward' and the 'Assign to Me' actions as buttons too, the Property under each of those actions was added - to show the button in the preview area.
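As a rough sketch of how this fits together (the action name and handler attribute below are invented for illustration; only the Property element itself is the one from the SAP Note), the property sits inside the action's definition in the XML:

```xml
<!-- Illustrative sketch only: "ApproveAction" and the handler value are
     made-up examples, not the SAP-delivered XML. The Property element is
     the one from OSS Note 1803438. -->
<Action name="ApproveAction" handler="UserDecisionHandler">
  <Properties>
    <!-- Renders the action as a button in the preview area,
         not only in the context menu -->
    <Property name="showButtonInPreviewArea" value="yes"/>
  </Properties>
</Action>
```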


Now I could go on for quite a while about XML itself.  It's picky.  It doesn't like me.  It is impossible to understand.  But with a little help from my friends, I've overcome some of those hurdles. 


There's a lot more to do, and a lot more to blog about.  If I have to close this blog with some tips, they are:

1) identify the tools you need (XML editor) and get them

2) set up your file management system

3) have a cup of tea

4) do not underestimate the value of friends, or a pair of fresh eyes

5) check SAP Notes early AND often

6) Thank people when they've been helpful (that's to you, Ginger Gatling, Thomas Kosog, Jocelyn Dart, James Ibbotson and George Credland)


Stick around for Part 2, in which I will rant about something else.


What about you?  Did you struggle with XML at first?  What were your biggest hurdles?  How did you overcome them?


<Action name="Closing" value="Cheers"/>



  

BPM OData: Implementing a Custom Task Inbox


This blog post, as part 4 of BPM OData blog series, refers to the OData Service in SAP NetWeaver BPM available with SAP NetWeaver 7.3 EHP 1 SP 09 and higher. The features described in this blog post are available with SAP NetWeaver 7.3 EHP 1 SP 12 and higher. Before reading this blog post, it is recommended to read the previous parts of the BPM OData blog series.

 

 

Overview

In the previous blog posts, we were considering the following business scenario: a customer record is created, its data is verified by a customer service employee and in the end a financial specialist defines a credit limit for the created customer based on the provided data. In such a credit institution, a lot of these processes can be initiated from day to day. That means that both the customer service employee and the financial specialist need a list of the corresponding tasks to work on. Obviously, each of them should be able to see only the tasks he/she is responsible for and the tasks, which require processing. In addition, a task inbox should support the employee in organizing his/her work. It might be desirable to sort the tasks, for example, by date to ensure that the customer who initiated the process first also gets the institution's response first. Fortunately, SAP NetWeaver BPM provides a tool called BPM Inbox, which meets the aforementioned requirements, and even more. But what if you have some special requirements, which the standard BPM Inbox cannot fulfill? Don’t worry! You can now implement your own inbox. And this blog post can be used as a starting point.

 

BPM Tasks OData Service

The previous blog posts about the BPM OData service mostly described a custom task execution UI. The main information on that UI is the task data. Nevertheless, before working with the task data you see information about the task itself such as its title, status and priority. To work on a task it had to be claimed first. That was the time when the BPM Tasks OData service came into play. At that time, the service was briefly described. It supported only the receiving of information about a particular task and the claiming of a task. Now, let us describe this service in more detail because it is the functionality we will use to implement a custom inbox.

Some technical details of the BPM Tasks OData service have already been mentioned in this blog post. Just as a refresher: the BPM Tasks OData service has the name ‘tasks.svc’ and is available under ‘bpmodata’ root URL as all other BPM OData services. This means that the following pattern can be used to represent the service URLs:

 

http://<host>:<port>/bpmodata/tasks.svc/<OData_resource_path_and_query_options>


Starting from SAP NetWeaver 7.3 EHP 1 SP 12, this service provides more functionality regarding the access to task-related information and task-related operations exposed in the BPM Public API. The service provides the following set of features:

  • Access to a collection of available BPM Tasks
  • Access to a specific BPM Task
  • Claim a BPM Task
  • Release a BPM Task
  • Forward a BPM Task
  • Search for BPM end-users
  • Access to BPM Task Definitions
  • Access to custom attributes of a BPM Task
  • Access to custom actions of a BPM Task
  • Execute a custom action of a BPM Task

 

To support all the aforementioned operations, the service defines the corresponding entity model. The most important entity in this model is Task. This entity contains all the information about a task including title, status, priority, etc. Task entity has already been mentioned in this blog post when the retrieval of task metadata was described.

As you can see, the BPM Tasks service provides many features, and if this blog post described all of them in detail, it would probably never have been published. Moreover, a custom inbox for the aforementioned business scenario in a credit institution can be implemented using only a subset of the service functionality.

 

Some of these operations have already been described in the previous blog posts. Therefore we will focus only on accessing a collection of tasks and releasing a task, in this blog post. More information about the functionality that is provided by the BPM Tasks OData service, the supported URLs and the entity model can be found in the official documentation.

 

Accessing a Collection of Tasks

Usually, the main page of an inbox, whether a task inbox or an e-mail inbox, presents a number of the corresponding items, in our case tasks. To get a collection of tasks, the BPM Tasks OData service provides the TaskCollection entity set. The service response for this entity set contains Task entities corresponding to the tasks that are visible to the current user, i.e. the tasks for which the current user is a potential or actual owner. Since a potential owner can see all of his tasks, including canceled and completed ones, the service response can contain a huge number of tasks. In such a situation, it is difficult for the user to find the tasks that really require processing among the already completed ones. Moreover, returning all the visible tasks takes a lot of processing time and affects the service performance on the server. To prevent this, all requests for the TaskCollection entity set must have the $filter OData query option specified. The purpose of this query option is to define filtering criteria so that only the tasks matching the criteria are returned in the service response.
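To make the shape of such requests concrete, here is a plain-JavaScript sketch that assembles a TaskCollection URL with the mandatory $filter option by hand (my own hypothetical helper; the status values and URL prefix are simply the ones used throughout this post):

```javascript
// Sketch: building a filtered TaskCollection request URL by hand.
// "statuses" is an array such as ["READY", "RESERVED"]; the individual
// comparisons are joined with "or", matching the URLs shown in this post.
function buildTaskCollectionUrl(statuses) {
  var filter = statuses
    .map(function (s) { return "Status eq '" + s + "'"; })
    .join(" or ");
  return "/bpmodata/tasks.svc/TaskCollection?$filter=" +
    encodeURIComponent(filter) + "&$format=json";
}
```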

 

The table below shows the URL used to access a collection of tasks along with the service response:

HTTP Method: GET

URL:

.../bpmodata/tasks.svc/TaskCollection?$filter=Status eq 'READY' or Status eq 'RESERVED'&$orderby=CreatedOn desc&$format=json

Response Body (simplified):

{
  "d": {
    "results": [
      {
        "__metadata": { … },
        "TaskDefinitionName": "Verify Customer Data",
        "TaskTitle": "Verify Customer Data",
        "Priority": "HIGH",
        "Status": "READY",
        "CreatedOn": "/Date(1409673092830)/",
        "CreatedBy": "Administrator",
        "CreatedByName": "Administrator",
        "Processor": "",
        "StartDeadLine": "/Date(1409674893053)/",
        "CompletionDeadLine": "/Date(1409675193053)/",
        "ExpiryDate": "/Date(1409676693053)/",
        "IsEscalated": false,
        "SupportsClaim": true,
        "SupportsRelease": false,
        "SupportsForward": true,
        "SupportsComments": true,
        "IsSubstituted": false,
        "SubstitutedUser": "",
        "UIExecutionLink": { … },
        "TaskDefinitionData": { … },
        "CustomAttributeData": { … },
        "Description": { … },
        "Comments": { … },
        …
      }
    ]
  }
}

 

The URL given in the example above requests all ready and reserved tasks for the current user.
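Note that date properties such as CreatedOn are returned in the OData V2 JSON notation "/Date(milliseconds since epoch)/". A small sketch of a parser for such values (my own helper, not part of the service or SAPUI5, which handles this conversion for you when you use an ODataModel):

```javascript
// Sketch: converting OData V2 JSON date strings such as
// "/Date(1409673092830)/" into JavaScript Date objects.
function parseODataDate(value) {
  var match = /^\/Date\((-?\d+)\)\/$/.exec(value);
  return match ? new Date(parseInt(match[1], 10)) : null;
}
```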

 

If besides filtering the returned tasks should also be ordered by a particular task attribute, the $orderby OData query option can be used. In the example, the returned tasks will be ordered by creation date in a descending order, which is the default ordering for TaskCollection.

Besides $filter and $orderby query options, other OData query options are supported for the TaskCollection entity set. More information about the OData query options can be found at the odata.org website. Note that not all OData query options are supported for the given Task entity; the list of the supported URLs for the entity set can be found in the official documentation.

 

Releasing a Task

While processing a task, the following situation can occur: a person starts to work on a task (i.e. claims it) and it is then decided that the task should be released so that it is available to another person. In our business scenario, this could look as follows: a financial specialist starts to work on customer data to determine a credit limit, but then realizes that a colleague has great experience with customers from this particular area of business and can determine the credit limit more precisely. In this case, the task should be released by the initial user so that it can be claimed by the colleague.

To release a task, the BPM Tasks OData service provides the Release function import, which takes the task's instance ID as a parameter. Only the user who is working on a task, i.e. the task's actual owner, can release it.
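As a rough sketch (a hypothetical helper of my own, mirroring the URL pattern from this post), the Release URL with its quoted InstanceID parameter could be assembled like this:

```javascript
// Sketch: building the Release function import URL. The instance ID is
// passed as a quoted URL parameter, as in the request table in this post.
function buildReleaseUrl(instanceId) {
  return "/bpmodata/tasks.svc/Release?InstanceID='" +
    encodeURIComponent(instanceId) + "'";
}
```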

 

The table below shows the URL used to release a task along with the service response:

HTTP Method: POST

URL:

.../bpmodata/tasks.svc/Release?InstanceID='02f6e30632b911e485d300000034b6ba'

Request Headers:

Authorization: Basic dXNlcm5hbWU6cGFzc3dvcmQ=
X-CSRF-Token: 781057a9-b96a-468c-b393-981f98292335
Accept: application/json

Response Body (simplified):

{
  "d": {
    "results": [
      {
        "__metadata": { … },
        "TaskDefinitionName": "Verify Customer Data",
        "TaskTitle": "Verify Customer Data",
        "Priority": "HIGH",
        "Status": "READY",
        "CreatedOn": "/Date(1409673092830)/",
        "CreatedBy": "Administrator",
        "CreatedByName": "Administrator",
        "Processor": "",
        "StartDeadLine": "/Date(1409674893053)/",
        "CompletionDeadLine": "/Date(1409675193053)/",
        "ExpiryDate": "/Date(1409676693053)/",
        "IsEscalated": false,
        "SupportsClaim": true,
        "SupportsRelease": false,
        "SupportsForward": true,
        "SupportsComments": true,
        "IsSubstituted": false,
        "SubstitutedUser": "",
        "UIExecutionLink": { … },
        "TaskDefinitionData": { … },
        "CustomAttributeData": { … },
        "Description": { … },
        "Comments": { … },
        …
      }
    ]
  }
}

 

The service response for the function import contains the Task entity corresponding to the released task.

 

Implementing the UI

Having all the necessary information about how to consume the BPM Tasks OData service for the simple business scenario, it is time to implement the UI for the aforementioned operations. This section describes how to create a UI for accessing a collection of tasks, including filtering and ordering of tasks, as well as how to create a UI to release a task.

 

Implementing a UI for Accessing a Collection of Tasks

A simple UI for accessing a collection of tasks can be implemented in the form of a table with rows corresponding to the tasks available to the current user:

[Image: tasks_table.png]

 

As usual, the corresponding SAPUI5 view and controller have to be implemented. At this step, it is enough to implement only the onInit() function in the controller:

onInit : function() {
    var tasksServicePath = "/bpmodata/tasks.svc/";
    var tasksODataModel = new sap.ui.model.odata.ODataModel(tasksServicePath, true);
    tasksODataModel.setDefaultCountMode(sap.ui.model.odata.CountMode.None);
    tasksODataModel.setDefaultBindingMode(sap.ui.model.BindingMode.OneWay);
    this.getView().setModel(tasksODataModel);
}

 

In this function, an ODataModel for communicating with the BPM Tasks OData service is created. The BPM Tasks OData service only allows reading of Task entities; therefore, the sap.ui.model.BindingMode.OneWay binding mode can be used.

 

That's it! Just a few lines of code in the controller are needed to be able to read the collection of tasks.

 

In order to represent a collection of tasks as a table the following sap.ui.table.Table should be created in the SAPUI5 view:

var tasksTable = new sap.ui.table.Table({title: "Tasks"});
tasksTable.addColumn(new sap.ui.table.Column({
    label: new sap.ui.commons.Label({text: "Task Title"}),
    template: new sap.ui.commons.TextView({text: "{TaskTitle}"})
}));
tasksTable.addColumn(new sap.ui.table.Column({
    label: new sap.ui.commons.Label({text: "Status"}),
    template: new sap.ui.commons.TextView({text: "{Status}"})
}));
tasksTable.addColumn(new sap.ui.table.Column({
    label: new sap.ui.commons.Label({text: "Priority"}),
    template: new sap.ui.commons.TextView({text: "{Priority}"})
}));
tasksTable.addColumn(new sap.ui.table.Column({
    label: new sap.ui.commons.Label({text: "Creation Date"}),
    template: new sap.ui.commons.TextView({text: {
        path: "CreatedOn",
        formatter: function(date) {
            if (date) {
                var dateFormatter = sap.ui.core.format.DateFormat.getDateTimeInstance({style: "medium"});
                return dateFormatter.format(date);
            }
            return "";
        }
    }})
}));
tasksTable.bindRows("/TaskCollection", null, null, null);

 

The table created by the snippet above shows the title, status, priority and creation date of each task. To show the values of the corresponding properties of the Task entity type, binding expressions such as {TaskTitle} are used. The table is bound to the TaskCollection entity set using the bindRows() method. The first method parameter specifies the OData resource path relative to the URL for which the ODataModel has been created. During the binding, a service request will be sent to the following URL:

 

http://<host>:<port>/bpmodata/tasks.svc/TaskCollection

 

Filtering of Tasks on the UI

Having the code to create the table does not mean that the table will actually be populated with tasks. As you remember, requests to the TaskCollection are not allowed without a $filter query option. This means that the $filter query option has to be defined before sending requests to the OData service.

 

To specify filtering criteria, SAPUI5 provides sap.ui.model.Filter class. Let us assume that we need to create criteria to get only ready tasks. For that purpose, the following Filter should be created:

var readyStatus = new sap.ui.model.Filter("Status", sap.ui.model.FilterOperator.EQ, "READY");

The constructor requires a binding path, an operator and a filtering value. In our case, we create a Filter specifying that the Status property of the Task entity should be equal to the value READY. To use the created Filter when sending requests to the OData service, it should be specified as a parameter of the table's bindRows() method:

tasksTable.bindRows("/TaskCollection", null, null, [readyStatus]);

 

As a result, the following URL will be sent to the service during the binding:

 

http://<host>:<port>/bpmodata/tasks.svc/TaskCollection?$filter=Status eq 'READY'

 

Now, during the view rendering, the created table will be populated with ready tasks, for which the current user is a potential owner.

 

In the previous example, only one Filter has been created. But what if multiple Filters are required? Let us assume that all the ready and reserved tasks for the current user should be shown in the table. For that purpose, a new Filter should be created to indicate that reserved tasks should be returned as well.

 

var reservedStatus = new sap.ui.model.Filter("Status", sap.ui.model.FilterOperator.EQ, "RESERVED");

After that the created filter should be added to the array of filters that is passed to bindRows() method:

 

tasksTable.bindRows("/TaskCollection", null, null, [readyStatus, reservedStatus]);

 

As a result, the following URL will be sent to the service during the binding:

 

http://<host>:<port>/bpmodata/tasks.svc/TaskCollection?$filter=Status eq 'READY' or Status eq 'RESERVED'

 

It shows that if multiple Filters are provided for the bindRows() method, they are connected with the OR operator in the $filter expression. In case the Filters should be connected with the AND operator, for example, to filter tasks by status and priority, a combined Filter should be created:

var readyStatus = new sap.ui.model.Filter("Status", sap.ui.model.FilterOperator.EQ, "READY");
var highPriority = new sap.ui.model.Filter("Priority", sap.ui.model.FilterOperator.EQ, "HIGH");
var filterByStatusAndPriority = new sap.ui.model.Filter([readyStatus, highPriority], true);

 

The last argument of the Filter constructor indicates whether the specified Filters should be connected with the AND operator. Providing such a Filter for the bindRows() method:

tasksTable.bindRows("/TaskCollection", null, null, [filterByStatusAndPriority]);

leads to the following URL sent to the service

 

http://<host>:<port>/bpmodata/tasks.svc/TaskCollection?$filter=Status eq 'READY' and Priority eq 'HIGH'

 

Ordering of Tasks on the UI

In the previous section, filtering of tasks was described. But what if the returned tasks should also be ordered by specific attributes? For that purpose, SAPUI5 provides sap.ui.model.Sorter class. Let us assume that we need to order tasks by the creation date to show the latest tasks at the beginning of the table. This means that the ordering by the creation date should be descending. For that purpose, the following Sorter should be created:

 

var sorter = new sap.ui.model.Sorter("CreatedOn", true);

 

The last parameter of the constructor indicates whether the sorting should be descending. To apply the Sorter it should be specified as a parameter for the bindRows() method:

 

tasksTable.bindRows("/TaskCollection", null, sorter, [readyStatus, reservedStatus]);

 

As a result, the following URL will be sent to the service during the binding:

 

http://<host>:<port>/bpmodata/tasks.svc/TaskCollection?$filter=Status eq 'READY' or Status eq 'RESERVED'&$orderby=CreatedOn desc

 

Implementing a UI to Release a Task

As mentioned before, to release a task, a POST HTTP request should be sent to the service to call the Release function import. For that purpose, a new function should be implemented in the SAPUI5 controller:

 

release : function(taskInstanceId) {
    if (taskInstanceId) {
        var urlParameters = {
            "InstanceID" : decodeURIComponent(taskInstanceId)
        };
        var functionParameters = {};
        functionParameters.method = "POST";
        functionParameters.urlParameters = urlParameters;
        functionParameters.success = function() {
            alert("Task released!");
        };
        functionParameters.error = function() {
            alert("Release failed!");
        };
        var tasksODataModel = this.getView().getModel();
        tasksODataModel.callFunction("Release", functionParameters);
    }
}

 

The function takes an encoded task instance ID as a parameter, which is exactly how it is returned by the BPM Tasks OData service. To call a function import, ODataModel provides the callFunction() method, which requires two input parameters: the function import name and a configuration object specifying how the function import should be called. The configuration object defines the HTTP method to use, a map of the function import's URL parameters, and the functions to be called if the call succeeds or fails. Once the function import has been called, callFunction() refreshes the ODataModel. This means that once the task is released, the corresponding row in the table of tasks on the UI is updated automatically.

 

Conclusion

This part has shown the implementation of a UI for accessing a collection of tasks and releasing a task. The technical details of this functionality have been described, and it was shown how to consume the corresponding operations from SAPUI5. Together with the UIs for working with task data and task metadata and for claiming a task, the UIs described in this blog post make it possible to implement a custom inbox that combines all the mentioned features in one place. Such an inbox can also be enhanced by adding support for other task-related operations of the BPM Tasks OData service. More information about the other functionalities provided by the BPM Tasks OData service can be found in the official documentation.

Creating OData-based SAPUI5 UIs for BPM tasks – the easy way…


In his blog post “BPM OData: Implementing a Basic Custom Task Execution UI”, Andre demonstrates how the OData service of BPM combined with the power of SAPUI5 can be leveraged to create modern custom UIs for your BPM tasks. One thing, however, is missing compared to previous technologies like Web Dynpro (Java) and Visual Composer: the ability to quickly generate a rough UI based solely on the process information using the ‘New Task’ wizard. Providing a structure and a lot of the necessary boilerplate coding is extremely useful, both for just trying out some prototypes, for example, in trainings, and also as a starting point for custom implementations.

 

Starting with SAP NetWeaver 7.3 EHP 1 SP 13, the enhanced ‘New Task’ wizard now provides this ability. The first version has some limitations (see below), but is already fully usable for a simple roundtrip.

Let’s take the same process Andre uses in his blog:

Process.png

This is the XSD for the data objects (also used for the process start):

 

<?xml version="1.0" encoding="UTF-8"?>
<schema xmlns="http://www.w3.org/2001/XMLSchema" targetNamespace="http://www.example.org/Customer"
        xmlns:tns="http://www.example.org/Customer" elementFormDefault="qualified">
    <complexType name="Customer">
        <sequence>
            <element name="firstName" type="string"></element>
            <element name="lastName" type="string"></element>
            <element name="address_street" type="string"></element>
            <element name="address_city" type="string"></element>
            <element name="address_zip" type="int"></element>
            <element name="address_country" type="string"></element>
            <element name="currency" type="string" default="EUR"></element>
        </sequence>
    </complexType>
    <complexType name="Credit">
        <sequence>
            <element name="creditLimit" type="double"></element>
        </sequence>
    </complexType>
</schema>

(Note that I used underscores in the element names instead of hyphens, as the wizard disallows them.)
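To make the flat structure concrete, here is what an instance of the Customer type would look like as a JavaScript object (a sketch with made-up sample values):

```javascript
// Sample instance of the Customer type from the XSD above (values made up).
// Note the flattened address_* fields that result from avoiding complex types.
var customer = {
  firstName: "John",
  lastName: "Doe",
  address_street: "Main str.",
  address_city: "Walldorf",
  address_zip: 69190,        // xsd:int maps to a number
  address_country: "Germany",
  currency: "EUR"            // default value defined in the XSD
};
```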

 

In the properties of the human activity ‘Verify Customer Data’, we can now choose to create a new task:

Properties_NewTask.png

 

The ‘New Task’ wizard opens, providing an option to create a UI implementation for us:

NewTaskWizard1.png

 

Of course, we select the new ‘SAPUI5’ technology.

NewTaskWizard2.png

 

The generated files will be written to a Web-DC which can then be easily deployed using the standard deployment process (using an EAR project). If no Web-DC exists yet, simply create one using the ‘New’ button.

 

In the next step, we have to specify the names of the component and of the complete and error events. The wizard already provides defaults, so we do not need to change anything here.

 

NewTaskWizard3.png

 

Each generated UI will use a separate folder to enable placing several UIs in the same Web-DC. The name of this folder is taken from the component name. In this example, I changed it from the default VerifyCustomerComponent to VerifyCustomer, to create a nicer URL.

 

Finally, we can choose the context attributes we want to include in this UI from the data objects.

NewTaskWizard4.png


Internally, a new XSD is created from the selection made here. This allows freely combining attributes from different data objects. For simplification, the same XSD type is used for input, output and faults. Those could be changed later in the task properties, if the generated files are adjusted accordingly.

 

The first version of this generator has a few limitations:

  • Only simple types of text, numbers and booleans (displayed as checkboxes) are supported
  • No complex types (with the exception of a root complex type containing all other simple types, as shown in this screenshot)
  • No array types
  • No date and time types

 

The wizard will create a new task of the ‘Custom UI Component’ variant. It also assigns the internally generated XSD to it.

 

The URL (displayed in the ‘Component’ field) needs to point to the full (relative) path to the new UI, which resides in the Web-DC. Don’t worry, the wizard will set it automatically for you. You can see my change to the component name reflected:

 

CreatedCustomTask.png

Relative URLs are used here, but when using the BPM Inbox, the base URL with the hostname needs to be configured. If this configuration differs from the actually used URL (for example, domain name versus IP address), the OData calls will fail due to the “Same Origin” security policy.
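The effect can be illustrated with the browser's origin rules. In the sketch below (host names are made up), both URLs may well reach the same server, yet their origins differ, so cross-origin OData calls are blocked:

```javascript
// Extract the origin (scheme + host + port) of a URL. Two URLs with
// different host names have different origins even if they resolve to
// the same machine, which is why the OData calls would fail.
function originOf(urlString) {
  var url = new URL(urlString);
  return url.protocol + "//" + url.host;
}

var configured = originOf("http://myhost.example.com:50000/bpminbox");
var actual = originOf("http://10.0.0.1:50000/bpminbox");
var sameOrigin = configured === actual; // false: the same-origin check fails
```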

 

Adding input and output mapping can be done in the same way as for any task.

 

That’s it – now you can deploy and run your process, and the task is usable.

 

GeneratedUI_A.png

 

For further development with SAPUI5, please refer to the respective documentation (see also the Development Toolkit).


But to get you started, I'll explain a few basic changes to the generated UI to make it look nicer.


Have a look into the generated folder structure:

FolderStructureFull.png

Index.html is the file referred to by the ‘Custom UI Url’ (‘Component’ field in the task) and thus called from the BPM Inbox. The main display description (the ‘view’) is in the file DO_CustomerForm.fragment.xml, where ‘fragment’ implies that it is rendered as part of an outer container. When we open this file, we can re-arrange the XML elements and thus achieve the desired order of fields.


The displayed texts are externalized (ready for localization) and taken from the messageBundle.properties file. You can freely change that content, for example, to translate your UIs. The defaults are uppercase versions of the attribute element names, for easier identification of which values need to be adjusted.
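For example, the messageBundle.properties file might be adjusted like this (the keys shown here are illustrative; the actual keys depend on your attribute names):

```properties
# messageBundle.properties - texts for the generated UI
# Generated defaults are uppercase attribute names, e.g. FIRSTNAME=FIRSTNAME.
# Adjusted to friendly labels:
FIRSTNAME=First name
LASTNAME=Last name
ADDRESS_CITY=City
```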


Caveat: the file needs to be in UTF-8 format. NWDS sometimes does not correctly recognize the file encoding, which leads to strange-looking artifacts when characters outside the US-ASCII 7-bit range are used. Thus, make sure to set UTF-8 in the properties of the file before opening it in the internal editor!

 

UTF8.png

 

Now this looks much nicer:

GeneratedUI_A1.png

Nothing has changed? Remember to clear your browser cache.


The generated UI uses OData for back-end communication. This is fully automated and needs no further effort. You can see that it works by completing the task and verifying the changed data in the process context:

ProcessContext1.png


Aside from reading the task input data and writing the changed data back, the generated logic also automatically claims the task upon opening the Task Execution UI and closes the browser tab when the task has been completed. Of course, this can be disabled by changing the generated source.


The process flow would now expect to have this data confirmed (or changed – let’s do that) and, as a second task, a credit limit added. Again, use the ‘New Task’ wizard to create a UI, selecting those fields:

NewTaskWizard4_b.png

Note: explicitly not selecting the DO_Customer / DO_CreditLimit nodes ensures that a single complex type consisting of only simple types will be generated (and named ‘Form’).


The generated files /CreditLimit/view/FormForm.fragment.xml and /CreditLimit/i18n/messageBundle.properties can again be adjusted as described above. In this case, I added editable="false" to the TextFields in the XML to create read-only UI input fields for all attributes except currency and creditLimit. I also mapped only those two in the Task Output Mapping.


The result looks like this – a nice UI, with custom field labels, sorted, partly read-only. It displays the changed data from the first task UI and lets me enter currency and credit limit:

GeneratedUI_C2.png

 

Further improvements of the generator might include generated code for fault handling (like a ‘Cancel’ button) and enhanced type support. But with the enhanced ‘New Task’ wizard and the automatic SAPUI5 generation for a task, the main work is done, and thus you have a powerful tool as a first step towards your own customized UIs.



Appendix


For deployment of the Web-DC, an ‘EAR Project’ is needed. If there is not already one available, a new one can be created as a new project of type ‘Development Component’ by selecting ‘JavaEE’ / ‘Enterprise Application’. The last page in the wizard offers the possibility to add referenced projects. Alternatively, the reference can be added later in the ‘Component Properties’ (right-click on the project and select ‘Development Component’ / ‘Show In’ / ‘Component Properties’). On the second tab, ‘Dependencies’, the Web-DC project can be added.

Now, when deploying this EAR, the referenced Web-DC is deployed as well.

Product over Process


Every organization prides itself on following the due process necessary to deliver the product. However, in reality Product drives all the decisions in the organization. As the saying goes, “All is well that ends well”, we tend to believe that all deviations in process are justified as long as we are able to deliver the product/solution to the customer.

 

Just walk into a bookstore and head to the Business/IT section, and you will find numerous books on following the process. Join an organization and the first set of trainings always talks about following the process to deliver a quality solution. At the start of a project, the team management outlines the steps (process) for the project execution. I can safely assume (based on my gut-feel analytics) that every professional has been part of at least one of the above scenarios. If everything around us talks of process, it is truly a mystery that eventually product takes precedence.

 

Two instances in recent times have provided me some insight into this issue. One was a recent article on the contrasting results of the India-England series by a noted cricket commentator, who wrote: “If there is anything to take from this India vs England season it is that people will tend to be good at things they like doing”. He goes on to say that we need to like what we do in order to actually succeed. The second was a book on self-confidence. In the first chapter, the author speaks of the power of our beliefs: our beliefs dictate the outcome, and no amount of will power can overcome them. The aspect highlighted in both the article and the book is the inner belief, and how the lack of it can result in failure in spite of all our external efforts.

 

As individuals we have a set of beliefs; we are guided by our belief system. Within the professional world, no amount of external effort can overcome the inner belief that at times we have to deviate from the process. We also tend to believe that sticking to the process is being stubborn. More importantly, everyone cares about the final outcome, and the end justifies the means. Now, this belief system is partly self-created but mostly comes from information perceived from the outside environment. How many times has a manager appreciated the team for sticking to the process, even though the end product was not as desired? The first reaction is that something was wrong and hence the solution went wrong. It’s the belief that process should always lead to an outcome desired by the team. Unfortunately, process is not biased towards the team or the client. It is designed for a purpose, which may not always be good in the short term but will definitely be optimal in the long term.

 

To review our beliefs with respect to process, we first need to recognize the existence of such a belief system. Every manager will project and quote the process, but only a few follow it. Information percolates from top to bottom; if something is not ideal at the bottom layer, the fix may well be at the top. In my opinion, we can address the situation by following some simple counselling steps:

 

1. Recognize the existence of the problem

 

2. Identify the scenarios in which we tend to overlook the process i.e. encounter the problem.

 

3. Form an action plan to self-instruct in such situations

 

4. The HARD part: in that moment (of truth), to actually stick to your action plan.

 

No amount of BPM tools or processes can solve the issue; only a change in the mindset of the individuals and the collective teams can. Now I realize why the cricket world cup victory in 1983 is given so much importance: it instilled the new belief that India can win. So, it is upon us to lead the way for others.

 

Ready. Set. Go.

 

The opinions expressed in this blog are personal. If any relevance is found, it is purely co-incidental.

Process Observer Customer Survey 2014


Dear Process Observer customers,

 

 

We kindly ask you to take part in the 2014’s survey on Process Observer offered by SAP Suite Engineering Foundation & Applications.

 

Process Observer (POB) is a component of the SAP Business Suite that allows customers to monitor and analyze the performance of their SAP Business Suite business processes. Process Observer is part of SAP’s BPM strategy. You can find out more about SAP’s BPM strategy in this document in SCN.

 

The focus of this survey is the collection of information on the use of Process Observer and the benefits realized. We are also asking for input for future development directions. The target group of the survey is employees and consultants from companies using Process Observer as part of their solution for monitoring and analyzing their business processes.

 

The survey contains 19 short questions. You do not have to fill out all of the questions.

 

You can access the 2014 survey here:

https://www.sapsurvey.com/cgi-bin/qwebcorporate.dll?idx=JMN7BZ

 

Time Schedule

The survey ends on November 28th 2014.

We plan to publish selected results on SAP SCN in December 2014 or January 2015.

 

For more background about Process Observer, see our central page in the SCN.

 

Thank you for your participation!

Questions and feedback are welcome.

 

 

Best regards,

Bernd Schmitt

BPM Troubleshooting for Beginners


Hi All,

 

Many documents and blogs have been written about How to Develop BPM Components, How to integrate external components into BPM, How to consume BPMs in other applications, BPM APIs etc.

But after the development of these objects, when the time comes to actually use them (run them on the server for testing, quality or even end-user usage), very little is known to BPM developers at the beginner level as to how and where to check what is going wrong with our processes.

Where is it stuck? Why is it stuck? Is there a technical issue? Is there a modelling issue? ..... and the list goes on.

 

In this blog, I would like to help you guys out with a few steps which can help you in finding the causes of disruptions of your process executions on the server.

 

To find the exact root cause of the problems, we will follow a classic 4-step approach, to be applied in chronological order while troubleshooting your BPM processes in the NetWeaver Administrator, which can be accessed by logging in to "http://<host>:<port>/nwa".

 

Note: For troubleshooting your business processes, you need to have appropriate authorizations on the server. You can refer to This Article for all roles & authorizations related to BPM. I would suggest having the "BRM_SuperAdmin" role.

 

Please follow these steps in the exact order in which they are listed.

 

Step 1: Process Repository-Error Log


Log-in to NWA --> Configuration --> Processes and Tasks --> Process Repository:



 

nwa1.jpg



Once you click on the Process Repository, you will see all the BPM Components deployed on your server. Select your concerned BPM Component, Scroll the page down, select the Parent Process Instance & click on the Process Link:

 

nwa2.jpg

 

This will open up the Process Instance Repository (which will contain all the Process Instances-Running and completed) for the selected BPM Component:

 

nwa3.jpg

 

 

The very first step here is to click on the nwa4.jpg button, which opens the process design of the BPM, to see where exactly the process is stuck. If the process is stuck at a Human Activity, it will have a RED token on that particular activity.

                                                                               

Moving ahead, if you look at this repository closely, you will see a lot of options you can use to operate on your BPM process instances.

 

At the bottom, you will see a tab-strip:

 

nwa3.jpg

 

The last tab here is that of the "Error Log". If the process instance is in a Suspended/Error status, this error log gets enabled and you will be able to find the full error stack trace here. In case the status of the Process is OK/In-progress, this tab will be greyed out (as above).

 

However, there are quite a few instances where the process is erroneous but you may not find anything in the error-log tab. This is when we go to the 2nd step.

 

Step 2: Process Repository-History

 

In the same tab-strip above, the 4th tab you will see is the "History" tab. This section lists all the execution steps of the BPM process.

If step-1 does not help you, click on this History Tab and from the available dropdown-options, select Advanced:

 

nwa5.jpg

In the description details listed, you will be able to find the exact detailed error trace in approximately the 3rd entry.

 

Note: I would personally suggest copying this full error trace into a notepad and then checking the problem. Sometimes, the table limits the display of the error text.

 

This step traditionally takes care of process modelling issues, data issues and other issues which are not build errors but run-time errors.

The only error this step may not track is a failed service execution in an Automated Activity, in which case we move to Step 3.

 

 

Step 3: NWA-Connectivity Logs

 

In NWA Home-page, go to SOA --> Logs & Traces --> Connectivity Logging & Tracing:

 

nwa6.jpg

 

When you open this, you will see all the error traces pertaining to service connectivity errors. You may search for your interface (which failed to execute and which we identified in Step 2 without knowing the exact error cause) and check for the errors:

 

nwa7.jpg

 

You are bound to find the final error here. But if you fail to do so (which is a very rare case), we move to Step 4.

 

Step 4: NetWeaver Logs

 

You can directly access the NetWeaver error-logs by logging-in to http://<host>:<port>/nwa/logs.

 

This log engine stores all the error logs of the errors which occur on the server (of all the running applications, as well as the configuration logs).

 

Firstly, change the error-log prioritization as shown in the snapshots below:

 

nwa8.jpg            

nwa9.jpg

You may enter the Development Component Name in the search criteria to search for errors pertaining to your Development Component.

You may then expand the error by clicking on the "+" sign on the left of the selected error entry and then check for the error details:

 

nwa11.jpg

 

Note: Please use this step as the last resort to finding the error details. It is very time-consuming to search for a single error among thousands of records on the server.

 

 

That's it. I think these steps will surely help you track down the cause of the error.

But please note that you should follow these steps in the order in which they are listed.

 

For an overview of the errors on all the BPM Components on a whole, you may check the BPM-System Overview option in NWA.

Check This Document for BPM System Overview.

 

You may also refer to the excellent article: A day in the life of an SAP NetWeaver Business Process Management Administrator by Birgit Heilig

 

Hope this helps.

 

Cheers.

 

Sid.


BPM OData: Implementing a Process Start UI


This blog post, as part 5 of BPM OData blog series, refers to the OData Service in SAP NetWeaver BPM available with SAP NetWeaver 7.3 EHP 1 SP 09 and higher. The features described in this blog post are available with SAP NetWeaver 7.3 EHP 1 SP 11 and higher. Before reading this blog post, it is recommended to read the previous parts of the BPM OData blog series.

Overview

In all the previous blog posts about the BPM OData service, we were considering a business scenario that is related to processing of customer data. In all the cases, the customer data was already available and the purpose of the blog posts was to describe how to implement a UI to work on the provided data. Some of you were probably wondering how a customer provides his data and whether it is possible to implement a SAPUI5 UI for that purpose. And what about the OData? Are there any OData services in SAP BPM that can be used for that purpose? The answer to all these questions is “Yes”. And this blog post answers the aforementioned questions in more detail.

The purpose of this blog post is to describe how to implement a UI to provide customer data. In our business scenario, such a UI can be considered as the entry point for a customer who wants to provide his data to the credit institution. Obviously, without the provided data neither the customer service employee nor the financial officer will be able to start working on the customer data to verify it and provide a credit limit after that. It means that the submission of the data by the customer can be considered as the event, which starts the entire process. In such a case, the customer data itself can be considered as the start data for such a process.

 

Process Model

If we represent the business scenario as a process in SAP BPM, submission of the customer data can be considered as the process start event and the customer data itself can be considered as the process start data. In this blog post, as well as in all the previous blog posts, the following process model is used for the business scenario:

process_model.png

In order to start a process in SAP BPM, a process start event should be triggered. As mentioned previously, in our business scenario, the process start event represents the submission of the data by the customer. To allow the customer to submit his data, it is necessary to define data for the process start event. For that purpose, a custom event trigger has been defined with a service operation that has the following input data structure:

<complexType name="Customer">
    <sequence>
        <element name="firstName" type="string"></element>
        <element name="lastName" type="string"></element>
        <element name="address" type="tns:Address"></element>
        <element name="currency" type="string" default="EUR"></element>
        <element name="phone-numbers" type="string" maxOccurs="unbounded" minOccurs="0"></element>
        <element name="vcards" type="tns:Vcard" maxOccurs="unbounded" minOccurs="0"></element>
    </sequence>
</complexType>
<complexType name="Address">
    <sequence>
        <element name="street" type="string"></element>
        <element name="city" type="string"></element>
        <element name="zip" type="integer"></element>
        <element name="country" type="string"></element>
    </sequence>
</complexType>
<complexType name="Vcard">
    <sequence>
        <element name="attr1" type="string"></element>
        <element name="attr2" type="string"></element>
        <element name="attr3" type="string"></element>
    </sequence>
</complexType>

More information about how to model process start events can be found in the official documentation.

If we compare the defined process start data with the input data of the Verify Customer Data task, we can see that the structure of the data is the same. As a result, once the process is started, its start data is mapped to the input data of the Verify Customer Data task to enable the customer service employee to verify the provided data.

 

Having all the required changes in the process model, it is time to start the process.

 

BPM Process Start OData Service

Starting from SAP NetWeaver 7.3 EHP 1 SP 11, the BPM OData service provides functionality to start a BPM process. For that purpose, the BPM Process Start OData service should be used. This service, along with all the other BPM OData services, is available under the ‘bpmodata’ root URL and has the name ‘startprocess.svc’. The BPM Process Start OData service provides the following operations:

  • Access to Process Start Data
  • Start a Process

Besides the supported operations, the BPM Process Start OData service, like every OData service, provides EDM service metadata to describe the service entity model. The metadata of the service contains two standard entity types and a number of entity types to represent the process start data. The standard entity types are named StartData and ProcessStartEvent. The purpose of the StartData entity type is to represent information about the process which can be started using the OData service. The ProcessStartEvent entity type is referenced by the StartData entity type and is used as a wrapper around the entity type representing the process start data itself. The entity type for the process start data itself is an EDM representation of the XSD complex type which is defined as the input of the service operation that triggers the process start event. In our case, the entity type for the process start data corresponds to the Customer XSD complex type and has the following structure:

<EntityType Name="Customer">
    <Key>
        <PropertyRef Name="EDM_Key" />
    </Key>
    <Property Name="EDM_Key" Type="Edm.String" Nullable="false" />
    <Property Name="firstName" Type="Edm.String" Nullable="true" />
    <Property Name="lastName" Type="Edm.String" Nullable="true" />
    <Property Name="currency" Type="Edm.String" Nullable="true" DefaultValue="EUR" />
    <NavigationProperty Name="address" Relationship="BPMProcessStart.Customer_Address"
        FromRole="Customer" ToRole="Address" />
    <NavigationProperty Name="vcards" Relationship="BPMProcessStart.Customer_Vcard"
        FromRole="Customer" ToRole="Vcard" />
    <NavigationProperty Name="phone-numbers" Relationship="BPMProcessStart.Customer_phone-numbers"
        FromRole="Customer" ToRole="phone-numbers" />
</EntityType>
<EntityType Name="Address">
    <Property Name="street" Type="Edm.String" Nullable="true" />
    <Property Name="city" Type="Edm.String" Nullable="true" />
    <Property Name="zip" Type="Edm.Decimal" Nullable="true" />
    <Property Name="country" Type="Edm.String" Nullable="true" />
</EntityType>
<EntityType Name="phone-numbers">
    <Key>
        <PropertyRef Name="EDM_Key" />
    </Key>
    <Property Name="EDM_Key" Type="Edm.String" Nullable="false" />
    <Property Name="phone-numbers" Type="Edm.String" Nullable="true" />
</EntityType>
<EntityType Name="Vcard">
    <Key>
        <PropertyRef Name="EDM_Key" />
    </Key>
    <Property Name="EDM_Key" Type="Edm.String" Nullable="false" />
    <Property Name="attr1" Type="Edm.String" Nullable="true" />
    <Property Name="attr2" Type="Edm.String" Nullable="true" />
    <Property Name="attr3" Type="Edm.String" Nullable="true" />
</EntityType>

More information about generation of EDM metadata from XSD can be found in the official documentation.

 

The process start data is defined at process design time and differs from process to process. It means that in order to generate the metadata containing the EDM representation of a particular process start data, the process identifier should be provided to the BPM Process Start OData service. Every process in SAP BPM can be uniquely identified by the name of its vendor, the name of the development component where the process is located, and the process name. All these identifiers have to be provided to the BPM OData service to generate EDM metadata for the process start data. In general, every URL for the BPM Process Start OData service contains all the process identifiers to identify the process whose start data should be retrieved or the one which should be started. As a result, all the URLs of the service can be represented using the following pattern:

 

http://<host>:<port>/bpmodata/startprocess.svc/<vendor>/<dcName>/<process_name>/<OData_resource_path_and_query_options>

 

In the pattern above, <vendor>, <dcName> and <process_name> represent process vendor name, name of the process development component and the process name respectively.
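As a sketch, this pattern can be expressed as a small JavaScript helper (the helper name, host and port below are made up; the process identifiers are the ones used later in this post):

```javascript
// Assemble a BPM Process Start OData service URL from the three process
// identifiers, following the pattern above. (Illustrative helper only.)
function startProcessUrl(host, port, vendor, dcName, processName, resourcePath) {
  return "http://" + host + ":" + port + "/bpmodata/startprocess.svc/" +
         [vendor, dcName, processName, resourcePath].join("/");
}

var url = startProcessUrl("myhost", "50000", "test.sap.com",
                          "tc~bpem~customer~process",
                          "CreateCustomerProcess", "StartData");
// → "http://myhost:50000/bpmodata/startprocess.svc/test.sap.com/tc~bpem~customer~process/CreateCustomerProcess/StartData"
```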

 

Accessing Process Start Data

Before starting a process, its start data should be retrieved to get information about its structure. After that, the retrieved data should be modified and sent to the OData service in order to start the process.

 

The table below shows the URL used to access process start data along with the service response:

HTTP Method: GET

URL: …/bpmodata/startprocess.svc/test.sap.com/tc~bpem~customer~process/CreateCustomerProcess/StartData?$expand=ProcessStartEvent,ProcessStartEvent/Customer,ProcessStartEvent/Customer/address,ProcessStartEvent/Customer/phone-numbers,ProcessStartEvent/Customer/vcards&$format=json

Response Body (simplified):

{
  "d": {
    "results": [
      {
        "vendor": "test.sap.com",
        "dcName": "tc~bpem~customer~process",
        "processTechnicalName": "CreateCustomerProcess",
        "processInstanceId": null,
        "ProcessStartEvent": {
          "Customer": {
            "firstName": null,
            "lastName": null,
            "currency": null,
            "address": null,
            "phone-numbers": { "results": [] },
            "vcards": { "results": [] }
          }
        }
      }
    ]
  }
}

 

The BPM Process Start OData service supports requests only to the StartData entity set. It means that in order to get the structure of the process start data itself, all the corresponding navigation properties should be expanded. For that purpose, the URL in the table contains the $expand OData query option with the corresponding values.
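The long $expand value in the URL above can be assembled from the individual navigation paths, for example (a plain-JavaScript sketch; the variable names are made up):

```javascript
// Navigation paths that need to be expanded to retrieve the full
// start-data structure in one request (taken from the URL above).
var expandPaths = [
  "ProcessStartEvent",
  "ProcessStartEvent/Customer",
  "ProcessStartEvent/Customer/address",
  "ProcessStartEvent/Customer/phone-numbers",
  "ProcessStartEvent/Customer/vcards"
];

var resourcePath = "StartData?$expand=" + expandPaths.join(",") + "&$format=json";
```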

The list of the supported URLs for the StartData entity set can be found in the official documentation. More information about the OData query options can be found on the odata.org web site.

 

Starting a Process

In the previous section, it has been shown how to get the process start data using the BPM Process Start OData service. The process start data is represented as Customer entity in the service response. Since the process has not been started, its start data is empty, which results in having ‘null’ values for all of its properties. In order to start a process the start data that has been returned by the service should be modified by setting values for all of its properties. After that, the modified start data should be sent to the service. In our case, modification of the start data requires setting values for the properties of the Customer entity. For primitive properties like firstName it is quite simple because the property value is a string representing the customer name whereas for complex properties like address or vcards it can be difficult because the property value is not a primitive one and there is no information about the structure of the value in the service response.

In order to get information about the structure of address and vcards properties the service metadata can be used. Because both of the properties are complex ones they are represented as navigation properties in the Customer entity type.

<NavigationProperty Name="address"
    Relationship="BPMProcessStart.Customer_Address" FromRole="Customer"
    ToRole="Address" />
<NavigationProperty Name="vcards"
    Relationship="BPMProcessStart.Customer_Vcard" FromRole="Customer"
    ToRole="Vcard" />

Looking at the definition of a navigation property, it is possible to determine the entity type to which it points: every navigation property has a ToRole attribute which contains the name of the entity type to which the property points. In our case, the address property points to the Address entity type. Based on the entity type structure

<EntityType Name="Address">
    <Property Name="street" Type="Edm.String" Nullable="true" />
    <Property Name="city" Type="Edm.String" Nullable="true" />
    <Property Name="zip" Type="Edm.Decimal" Nullable="true" />
    <Property Name="country" Type="Edm.String" Nullable="true" />
</EntityType>

the following entity can be created:

{
    "street": "Main str.",
    "city": "Walldorf",
    "zip": "69190",
    "country": "Germany"
}

Such an entity should be specified as the value of the address property of Customer entity. The entity for the vcards property can be created in the same way based on the Vcard entity type. The only difference is that vcards is a multi-valued property and the created entity should be specified in scope of a results[] array.
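Continuing the example, the value for the multi-valued vcards property can be assembled as a plain JavaScript object, while a single-valued property like address is just a nested object (attr1–attr3 are the property names from the sample Vcard entity type):

```javascript
// Build the value for the multi-valued "vcards" navigation property.
// OData (Verbose JSON) represents a collection as a "results" array.
var vcard = {
  attr1: "John Doe",
  attr2: "john.doe_at_provider.com",
  attr3: "john.doe"
};

var vcards = { results: [vcard] };  // one array entry per vCard

// The single-valued "address" property is just a nested object:
var address = {
  street: "Main str.",
  city: "Walldorf",
  zip: "69190",
  country: "Germany"
};
```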

 

In general, the OData service metadata should always be used when it comes to specifying a value for a property of an OData entity. Based on the metadata, you can find the type of the property and specify the value in the required format. The metadata also allows you to determine the structure of the property value when it is a navigation property. More information about how to specify values for properties in OData can be found on the odata.org web site.

 

Having the JSON object for the Customer entity it is time to send it to the BPM Process Start OData service in order to start the process. For that purpose, a POST request should be sent to the StartData entity set.

 

The table below shows the URL used to start a process along with the service response:

HTTP Method: POST
URL: …/bpmodata/startprocess.svc/StartData
Request Headers:
- Authorization: Basic dXNlcm5hbWU6cGFzc3dvcmQ=
- X-CSRF-Token: 781057a9-b96a-468c-b393-981f98292335
- Content-Type: application/json
- Accept: application/json
Request Body
{
    "ProcessStartEvent": {
        "Customer": {
            "firstName": "John",
            "lastName": "Doe",
            "currency": "EUR",
            "address": {
                "street": "Main str.",
                "city": "Walldorf",
                "zip": "69190",
                "country": "Germany"
            },
            "phone-numbers": {
                "results": [
                    { "phone-numbers": "111-111-111" },
                    { "phone-numbers": "222-222-222" },
                    { "phone-numbers": "333-333-333" }
                ]
            },
            "vcards": {
                "results": [
                    {
                        "attr1": "John Doe",
                        "attr2": "john.doe_at_provider.com",
                        "attr3": "john.doe"
                    },
                    {
                        "attr1": "J.D.",
                        "attr2": "jd_at_provider.com",
                        "attr3": "jd"
                    }
                ]
            }
        }
    }
}
Response Body
{
    "d": {
        "vendor": "test.sap.com",
        "dcName": "tc~bpem~customer~process",
        "processTechnicalName": "CreateCustomerProcess",
        "processInstanceId": "7367b9b33e5d11e485ae00000034b6ba",
        "ProcessStartEvent": {
            "Customer": {
                "firstName": "John",
                "lastName": "Doe",
                "currency": "EUR",
                "address": {
                    "street": "Main str.",
                    "city": "Walldorf",
                    "zip": "69190",
                    "country": "Germany"
                },
                "phone-numbers": {
                    "results": [
                        { "phone-numbers": "111-111-111" },
                        { "phone-numbers": "222-222-222" },
                        { "phone-numbers": "333-333-333" }
                    ]
                },
                "vcards": {
                    "results": [
                        {
                            "attr1": "John Doe",
                            "attr2": "john.doe_at_provider.com",
                            "attr3": "john.doe"
                        },
                        {
                            "attr1": "J.D.",
                            "attr2": "jd_at_provider.com",
                            "attr3": "jd"
                        }
                    ]
                }
            }
        }
    }
}

 

The service response for the sent request contains the data with which the process has been started. The response also contains a value for the processInstanceId property of the StartData entity; this value is the identifier of the started process instance.
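In a success callback, the identifier can be read directly from the returned entity. A minimal sketch, assuming the response shape shown above (the helper name getProcessInstanceId is hypothetical):

```javascript
// Extract the identifier of the started process instance from the
// StartData entity returned by the service. Depending on the client,
// the entity may or may not be wrapped in a "d" object.
function getProcessInstanceId(oData) {
  var entity = oData.d ? oData.d : oData;
  return entity.processInstanceId;
}

var sampleResponse = {
  d: {
    vendor: "test.sap.com",
    dcName: "tc~bpem~customer~process",
    processTechnicalName: "CreateCustomerProcess",
    processInstanceId: "7367b9b33e5d11e485ae00000034b6ba"
  }
};
// getProcessInstanceId(sampleResponse)
//   → "7367b9b33e5d11e485ae00000034b6ba"
```

The identifier could then be stored or displayed, for example to let the user track the started process instance.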

 

Implementing the UI

Having all the necessary information about how to consume the BPM Process Start OData service, it is time to implement a UI to start a process. As mentioned at the beginning of the blog post, a UI to start a process can be considered the entry point for a customer of a credit institution. Such a UI can be represented as a form containing a number of fields where the customer can enter their data. Once the form is filled, the 'Submit' button is clicked to submit the data, i.e. to start the process.

Customer_Data_UI.png

 

Implementing a UI for Accessing Process Start Data

One of the main steps when implementing a UI for an OData service is to define a model, which is represented as a sap.ui.model.odata.ODataModel instance in SAPUI5. In the previous blog posts, the ODataModel was used to retrieve existing data (e.g. task input data) from the OData service and to resolve UI bindings with this data. In that case, the entity returned by the OData service had values for top-level properties (e.g. firstName) as well as for nested properties (e.g. address). As a result, binding expressions like {Customer/address/city} were possible. In the case of the BPM Process Start OData service, the situation is different. As shown previously, the process start data returned by the service is empty because it corresponds to a non-started process. In this case, UI bindings for top-level properties like firstName will be resolved correctly, whereas the binding expression {Customer/address/city} will not be resolved because the address property is empty and the nested city property is not available.

 

As you remember, we had a similar situation in the Starting a Process section when it was necessary to create a POST request body to start a process. In that case, the service metadata was used to determine the structure of the address navigation property. After that, the corresponding entity was created manually and set for the property in the request body. In general, we need to do almost the same before defining a model for the view of the process start UI.

 

Obviously, looking into the metadata and hard-coding the structure of the navigation properties in the model is not a task an SAPUI5 developer dreams of. To make the developer's life easier, there is the utility.txt.zip file attached to this blog post. The file provides the createEntityForEntityType() function, which creates an OData JSON entity for the required entity type. In short, the function automates the manual entity creation shown in the Starting a Process section. It takes two parameters: the OData service metadata and the name of the entity type for which an entity should be created. The OData service metadata can be retrieved by calling the getServiceMetadata() function of the ODataModel. Sample usage is shown below:

 

var startProcessSvcURL =
"/bpmodata/startprocess.svc/test.sap.com/tc~bpem~customer~process/CreateCustomerProcess";
this.processStartODataModel = new sap.ui.model.odata.ODataModel(startProcessSvcURL, true);
this.processStartODataModel.setCountSupported(false);
this.edmMetadata = this.processStartODataModel.getServiceMetadata();
var customerEntity = createEntityForEntityType(this.edmMetadata, "Customer");

The entity that is created for the entity type also contains entities for all the navigation properties of the entity type. The figure below shows the entity that is created for the Customer entity type:

 

{
    "firstName": null,
    "lastName": null,
    "currency": null,
    "address": {
        "street": null,
        "city": null,
        "zip": null,
        "country": null
    },
    "phone-numbers": {
        "results": []
    },
    "vcards": {
        "results": []
    }
}
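The attached utility file contains the actual implementation of createEntityForEntityType(); the sketch below only illustrates the underlying idea, using a simplified metadata lookup table instead of the real EDM structure returned by getServiceMetadata():

```javascript
// Simplified sketch of creating an empty entity for an entity type:
// primitive properties become null, single-valued navigation
// properties recurse, multi-valued ones become { results: [] }.
// "metadata" here is a simplified lookup table, NOT the real EDM
// object returned by ODataModel.getServiceMetadata().
function createEntitySketch(metadata, typeName) {
  var type = metadata[typeName];
  var entity = {};
  type.properties.forEach(function (p) {
    entity[p.name] = null;
  });
  type.navigations.forEach(function (n) {
    entity[n.name] = n.multiple
      ? { results: [] }
      : createEntitySketch(metadata, n.toType);
  });
  return entity;
}

var metadata = {
  Customer: {
    properties: [{ name: "firstName" }, { name: "lastName" }],
    navigations: [
      { name: "address", toType: "Address", multiple: false },
      { name: "vcards", toType: "Vcard", multiple: true }
    ]
  },
  Address: {
    properties: [{ name: "street" }, { name: "city" }],
    navigations: []
  }
};
// createEntitySketch(metadata, "Customer") yields an entity with
// firstName/lastName set to null, a nested address entity, and an
// empty vcards.results array.
```

The real function works the same way in spirit, but walks the EDM entity types and their ToRole targets as shown in the metadata snippets above.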

 

Now it is time to set this entity on the model to be able to use binding expressions for the nested properties of address, as well as for multi-valued properties like phone-numbers and vcards.

Because every instance of the ODataModel is based on an OData service, changes in the model are queued internally as corresponding requests to the OData service and must be submitted at the end. In our case, we only want to use hierarchical binding expressions for the nested properties without sending any requests to the OData service. Therefore, sap.ui.model.json.JSONModel fits our purposes better:

 

var modelData = {};
modelData["Customer"] = customerEntity;
var jsonModel = new sap.ui.model.json.JSONModel(modelData);
var view = this.getView();
view.setModel(jsonModel);

 

The entire implementation of the onInit() function of the SAPUI5 controller containing all the aforementioned changes can be found in the attached startProcess.txt.zip file.

 

Working with Collections in Process Start Data

As shown on the screenshot of the sample UI, the customer's vCards are represented as a table. If we look at the customer entity created for the Customer entity type, we can see that the vcards property has an empty results[] array as its value. The results[] array is required by the OData protocol to represent a collection of items, and it is empty because the collection of vCards is empty by default. Since a vCard is represented as an element of the results[] array, the UI table should be bound to the vcards property in the following way:

vcardsTable.bindRows("vcards/results");

Columns of each table row should be mapped to the corresponding properties of a vCard:

 

var attr1Column = new sap.ui.table.Column({
    label : new sap.ui.commons.Label({ text : "attr1" }),
    template : new sap.ui.commons.TextField({ value : "{attr1}" })
});
var attr2Column = new sap.ui.table.Column({
    label : new sap.ui.commons.Label({ text : "attr2" }),
    template : new sap.ui.commons.TextField({ value : "{attr2}" })
});
var attr3Column = new sap.ui.table.Column({
    label : new sap.ui.commons.Label({ text : "attr3" }),
    template : new sap.ui.commons.TextField({ value : "{attr3}" })
});
vcardsTable.addColumn(attr1Column);
vcardsTable.addColumn(attr2Column);
vcardsTable.addColumn(attr3Column);

Names of the vCard properties can be found in the service metadata.

By default, the table of vCards is empty, but when the 'Add' button is clicked, a new row should be added to the table. Since the table is bound to the results[] array, this also means that a new entry should be added to the array. The added entry should be an empty entity for the Vcard entity type. To create such an entity, the already mentioned createEntityForEntityType() function should be called, passing the name of the entity type for vCards. The code snippet below shows the implementation of the function that is called when the 'Add' button is clicked for the table of vCards.

 

addVcard : function() {
    // create vCard entity
    var vCardEntity = createEntityForEntityType(this.edmMetadata, "Vcard");
    var vCardItem = jQuery.extend(true, {}, vCardEntity);

    // add the created vCard to the table
    var model = this.getView().getModel();
    var vcards = model.getProperty("/Customer/vcards/results");
    vcards.push(vCardItem);
    model.refresh();
}

At the end of the function, the refresh() method is called on the model. Refreshing the model also refreshes the bound UI controls. In our case, the table with vCards will be refreshed and a new row will be added to it.
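The 'Remove' callback can mirror addVcard by removing the selected entry from the bound results[] array and refreshing the model. The array handling can be sketched in plain JavaScript (removeAt is an illustrative helper, not taken from the attached file):

```javascript
// Illustrative counterpart to addVcard: remove one entry from the
// bound results array. The model holds a reference to this array, so
// mutating it in place and calling model.refresh() updates the table.
function removeAt(results, index) {
  if (index >= 0 && index < results.length) {
    results.splice(index, 1);
  }
  return results;
}

var vcards = [{ attr1: "John Doe" }, { attr1: "J.D." }];
removeAt(vcards, 0);
// vcards now contains only the "J.D." entry
```

In the real callback, the index would come from the table's selection, and model.refresh() would be called afterwards, exactly as in addVcard.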

 

The entire implementation of callbacks for ‘Add’ and ‘Remove’ buttons of the UI tables can be found in the attached startProcess.txt.zip file.

 

Implementing a UI for Starting a Process

On the sample UI, a customer clicks the 'Submit' button to submit their data, which automatically starts the process using the customer data as the process start data. The code snippet below shows the function that is called when the 'Submit' button is clicked.

 

startProcess : function() {
    var startData = {};
    startData.ProcessStartEvent = {};
    var jsonModel = this.getView().getModel();
    startData.ProcessStartEvent.Customer = jsonModel.getProperty("/Customer");
    this.processStartODataModel.create("/StartData", startData, null,
        function() {
            alert("Your data has been successfully submitted.");
        },
        function(oEvent) {
            alert("An error occurred while submitting the data.");
        });
}

 

The implementation of the function is quite straightforward. The customer data is retrieved from the JSON model and wrapped into a process start event, which is required by the BPM Process Start OData service. After that, the created JSON object is used as the request body for the POST request sent to the StartData entity set. The request is sent using the ODataModel that was created for the BPM Process Start OData service.

 

Conclusion

This part has shown the implementation of a UI to start a process. The technical details of the BPM Process Start OData service have been described and it was shown how to consume the service from an SAPUI5 application. A UI to start a process can be used as the entry point in the application having a BPM process in the back end.

The attached startProcess.txt.zip and utility.txt.zip files contain the implementation of the Process Start UI described in this blog post. These files can be used to ‘play’ with the UI and are not intended to be used for production purposes.

Don’t miss out on the SAP Middleware sessions at SAP TechEd && d-Code 2014 in Las Vegas


NEW, EXCITING, GAME-CHANGING

middleware_blog.jpg

SAP Middleware will be prominently covered at SAP TechEd && d-Code 2014.

 

Join the SAP Middleware Experts to learn collectively and individually how SAP is revolutionizing Middleware.


To make it easier for you, we created an overview below of all SAP Middleware-related sessions that will be held in Las Vegas (we will publish separate blogs for Berlin and Bangalore).

The sessions have been grouped into five main product areas:


  • Orchestration & Gateway: covering our SAP Process Orchestration portfolio, including SAP Process Integration, SAP Business Process Management (SAP BPM), SAP Business Rules Management (SAP BRM), and the SAP B2B Add-On. Learn, for example, where we are heading in the areas of mobilizing business processes and B2B, and how SAP Process Orchestration benefits from SAP HANA. Also hear the latest and greatest about SAP Gateway and SAP Application Interface Framework.
  • Operational Process Intelligence: powered by SAP HANA, enabling you to act and react in real time and letting you control situations before and as they are happening. Using Operational Process Intelligence you can foresee an inventory running low before the incident, handle a customer complaint in real time, and ensure pharmaceuticals are rerouted in times of crisis.
  • API Management: allowing you to control not only your software but also to connect to an unprecedented range of devices. We recently announced our partnership with Apigee and the first release of SAP API Management, so there will be a lot to talk about…
  • HANA Cloud Integration: facilitating the integration of business processes and data across on-premise and cloud applications (cloud-cloud and cloud-on-premise integration). Learn about the latest catalog in SAP HANA Cloud Integration with newly developed content, and experience how to discover, develop, and share ready-to-run integration content.
  • PowerDesigner: helping over 50,000 customers worldwide address their Enterprise Modeling and Documentation needs. Now, as part of SAP’s Middleware portfolio, PowerDesigner is a critical component for capturing, designing, and communicating the different aspects of your enterprise architecture before orchestrating and executing it.


Get to know why SAP Middleware is being recognized as a leader by analysts worldwide - learn about the current Middleware offerings, strategy, and roadmap, hear success stories, and help influence the direction.


Finally, make sure to keep an eye out for these game-changing themes promoted by our Middleware team: Intelligent Business Operations and the Internet of Things


We encourage you to take part in our sessions and to schedule a 1-on-1 meeting where we can guide you on the ideal Middleware strategy for your organization.

 

To schedule a one-on-one meeting with our topic experts, send an email here (please indicate your name, contact details, and any specific topics you would like to discuss).


Looking forward to seeing you there!

Smadar Ludomirski on behalf of the SAP Middleware team


LAS VEGAS SESSIONS


Session ID

Title

Session Type

Day and Hour

ORCHESTRATION & GATEWAY

INT800

Road Map Q&A: SAP Process Orchestration

Product Road Map Q&A (1hr)

Tue 11:00 AM - 12:00 PM
  Wed 12:00 PM - 1:00 PM

INT803

Road Map Q&A: Intelligent Business Operations

Product Road Map Q&A (1hr)

Tue 2:00 PM - 3:00 PM
  Wed 10:00 AM - 11:00 AM

INT200

Cloud Integration – An Update on Our Strategy

Lecture (1hr)

Tue 2:00 PM - 3:00 PM
  Thur 9:15 AM - 10:15 AM

INT361

Error Handling in BPM-Based Processes Using SAP Process Orchestration

Hands-On Workshop (2hr)

Wed 10:30 AM - 12:30 PM
  Thur 8:00 AM - 10:00 AM

INT360

Best Practices to Implement SAP Process Orchestration, B2B Add-On Solution

Hands-On Workshop (4hr)

Wed 2:15 PM - 6:15 PM
  Thur 2:15 PM - 6:15 PM

INT300

L'Oreal - Purpose-Specific Dual Landscape - SAP PI & Overall Experiences

Lecture (1hr)

Tue 11:00 AM - 12:00 PM

INT264

Use SAP Process Orchestration for On-Premise to Cloud Integration

Hands-On Workshop (2hr)

Wed 8:00 AM - 10:00 AM
  Thur 10:30 AM - 12:30 PM

INT208

Experiences with SAP Cloud for Customer and SAP Process Orchestration

Lecture (1hr)

Thur 2:00 PM - 3:00 PM

INT207

Customer Connection for SAP Workflow – And How You Can Benefit

Lecture (1hr)

Thur 9:15 AM - 10:15 AM

INT206

Integrating Shop-Floor with Enterprise in Real-Time – SAP MII In Action

Lecture (1hr)

Wed 5:45 PM - 6:45 PM

INT205

Ariba Network Integration with SAP Leveraging Process Orchestration

Lecture (1hr)

Wed 8:00 AM - 9:00 AM

INT203

Solution Architectures for Process-, Data-, and User-Centric Integration

Lecture (1hr)

Tue 3:15 PM - 4:15 PM
  Wed 8:00 AM - 9:00 AM

INT202

SAP Process Orchestration as B2B Gateway – Business Partners Integration

Lecture (1hr)

Tue 5:45 PM - 6:15 PM
  Thurs 11:45 AM - 12:45 PM

INT802

Road Map Q&A: SAP Gateway – On Premise and in the Cloud

Product Road Map Q&A (1hr)

Tue 1:00 PM - 2:00 PM
  Wed 2:00 PM - 3:00 PM

INT108

Microsoft and SAP: Innovating for Your Success

Lecture (1hr)

Wed 10:30 AM - 11:30 AM

INT100

Integration and Orchestration – Overview and Outlook

Lecture (1hr)

Tue 11:00 AM - 12:00 PM
  Thurs 10:30 AM - 11:30 AM

EXP17707

Streamline and Simplify B2B Integration with an Advisor Tool

Networking Session (30min)

Wed 1:00 PM – 1:30 PM
  Thurs 1:00 PM - 1:30 PM

EXP17561

Enforce Reliability and Compliance with Redwood Financial Close Automation

Networking Session (30min)

Wed 11:30 AM - 12:00 PM

EXP17510

IFG Survey Results on SAP Process Integration and SAP Process Orchestration

Networking Session (30min)

Wed 11:00 AM - 11:30 AM

EXP17507

Running SAP Solutions on OpenStack Powered by SUSE Cloud

Networking Session (30min)

Tue 1:30 PM - 2:00 PM

EXP17476

SAP Application Interface Framework 3.0 – Preprocessing Function

Networking Session (30min)

Tue 4:00 PM - 4:30 PM

EXP17408

SAP Integration and Orchestration Solutions: Customer Showcase

Networking Session (30min)

Wed 4:00 PM - 4:30 PM

EXP17406

Innovate with SAP Gateway and SAP API Management

Networking Session (30min)

Tue 3:30 PM - 4:00 PM

CJ602

Modeling and Composing OData Services for Mobile and Cloud

CodeJam mini-editions (1hr)

Tue 4:00 PM - 5:00 PM

CJ600

Integrate SAP Data in Microsoft Outlook via SAP Gateway for Microsoft

CodeJam mini-editions (1hr)

Tue 12:15 PM - 1:15 PM

INT103

OData in SAP Process Orchestration

Lecture (1hr)

Tue 5:45 PM - 6:45 PM
  Thur 4:30 PM - 5:30 PM

OPERATIONAL PROCESS INTELLIGENCE

INT102

SAP Operational Process Intelligence powered by SAP HANA

Lecture (1hr)

Tue 12:15 PM - 1:15 PM
  Wed 11:45 AM - 12:45 PM

INT803

Road Map Q&A: Intelligent Business Operations

Product Road Map Q&A (1hr)

Tue 2:00 PM - 3:00 PM
  Wed 10:00 AM - 11:00 AM

INT105

The Internet of Everything: How it Will Transform the Way Business is Done

Lecture (1hr)

Tue 2:00 PM - 3:00 PM

INT261

Build SAP Fiori-Style User Interfaces for Mobile Usage of SAP BPM

Hands-On Workshop (2hr)

Tue 2:00 PM - 4:00 PM
  Thurs 2:00 PM - 4:00 PM

INT162

Building a Business Scenario in SAP Operational Process Intelligence

Hands-On Workshop (4hr)

Tue 2:15 PM - 6:15 PM
  Thurs 8:00 AM - 12:00 PM

INT164

SAP BPM – Build and Run a Simple Business Process End-to-End

Hands-On Workshop (2hr)

Wed 8:00 AM - 10:00 AM
  Wed 2:00 PM - 4:00 PM

INT201

Unified Inbox with SAP Fiori

Lecture (1hr)

Wed 10:30 AM - 11:30 AM
  Thur 8:00 AM - 9:00 AM

EXP17536

Intelligent Business Operations with the SAP HANA Platform

Networking Session (30min)

Wed 1:00 PM - 1:30 PM
  Wed 1:30 PM - 2:00 PM

EXP17712

How to Implement Automated Business Process Validation in Your Organization

Networking Session (30min)

Wed 3:30 PM - 4:00 PM

EXP17787

Before Big Data, Did Your SAP Software Do Little Data?

Networking Session (30min)

Tue 1:00 PM - 1:30 PM

EXP17786

BMC Control-M: Das Batch

Networking Session (30min)

Thur 11:00 PM – 11:30 PM

INT163

Building Smart Process Apps with SAP Operational Process Intelligence

Hands-On Workshop (2hr)

Wed 4:30 PM - 6:30 PM
  Thurs 4:30 PM - 6:30 PM

API MANAGEMENT

INT263

Consuming SAP Data in Google Apps by Leveraging SAP Gateway

Hands-On Workshop (2hr)

Tue 11:15 AM - 1:15 PM
  Wed 4:30 PM - 6:30 PM

CJ601

Secure OData – How Customers Leverage OAuth for Secure Integration

CodeJam mini-editions (1hr)

Tue 2:45 PM - 3:45 PM

INT103

OData in SAP Process Orchestration

Lecture (1hr)

Tue 5:45 PM - 6:45 PM
  Thur 4:30 PM - 5:30 PM

INT260

Develop an E2E Integration Scenario with SAP Gateway, SAP HANA, and SAPUI5

Hands-On Workshop (4hr)

Wed 8:00 AM - 12:00 PM
  Thur 2:15 PM - 6:15 PM

INT204

Cutting-Edge SAP API Management Leveraging REST

Lecture (1hr)

Wed 3:15 PM - 4:15 PM
  Thur 10:30 AM - 11:30 AM

EXP17713

Cutting-Edge SAP API Management and REST

Networking Session (30min)

Tue 2:30 PM - 3:00 PM

HANA CLOUD INTEGRATION

INT800

Road Map Q&A: SAP Process Orchestration

Product Road Map Q&A (1hr)

Tue 11:00 AM - 12:00 PM
  Wed 12:00 PM - 1:00 PM

INT160

Integration of SuccessFactors Applications Using SAP HANA Cloud Integration

Hands-On Workshop (2hr)

Tue 11:15 AM - 1:15 PM
  Tue 4:30 PM - 6:30 PM

INT802

Road Map Q&A: SAP Gateway – On Premise and in the Cloud

Product Road Map Q&A (1hr)

Tue 1:00 PM - 2:00 PM
  Wed 2:00 PM - 3:00 PM

EXP17711

Effectively Managing Change as SAP Applications Expand into the Cloud

Networking Session (30min)

Tue 1:30 PM - 2:00 PM

INT200

Cloud Integration – An Update on Our Strategy

Lecture (1hr)

Tue 2:00 PM - 3:00 PM
  Thur 9:15 AM - 10:15 AM

INT803

Road Map Q&A: Intelligent Business Operations

Product Road Map Q&A (1hr)

Tue 2:00 PM - 3:00 PM
  Wed 10:00 AM - 11:00 AM

INT801

Road Map Q&A: SAP HANA Cloud Integration

Product Road Map Q&A (1hr)

Tue 3:00 PM - 4:00 PM
  Wed 1:00 PM - 2:00 PM

INT104

Integrate with SAP Financial Services Network and Ariba

Lecture (1hr)

Wed 9:15 AM - 10:15 AM
  Thur 5:45 PM - 6:45 PM

INT262

OData in SAP HANA Cloud Integration

Hands-On Workshop (2hr)

Wed 10:30 AM - 12:30 PM
  Thur 8:00 AM - 10:00 AM

INT161

Provide and Discover Content with SAP HANA Cloud Integration

Hands-On Workshop (2hr)

Wed 2:00 PM - 4:00 PM
  Thur 10:30 AM - 11:30 AM

INT101

Integrate Existing Systems and Replicate Data to SAP HANA Cloud Platform

Lecture (1hr)

Thur 3:15 PM - 4:15 PM
  Wed 4:30 PM -5:30 PM

POWER DESIGNER

DMM208

An Applied Enterprise Architecture Approach with SAP PowerDesigner

Lecture (1hr)

Tue 4:30 PM - 5:30 PM
  Thur 9:15 AM -10:15 AM

DMM810

Road Map Q&A: SAP PowerDesigner Future Directions

Product Road Map Q&A (1hr)

Tue 6:00 PM - 7:00 PM
  Thur 12:00 PM -1:00 PM

DMM215

Looking for an ARIS Alternative?

Lecture (1hr)

Wed 8:00 AM - 9:00 AM
  Thur 4:30 PM -5:30 PM

DMM216

SAP PowerDesigner Future Directions

Lecture (1hr)

Wed 9:15 AM - 10:15 AM
  Thur 5:45 PM -6:45 PM

EXP17588

Evolving Your Data Warehouse for Big Data

Networking Session (30min)

Wed 10:00 AM - 10:30 AM

DMM226

SAP PowerDesigner Tips, Tricks, and Customizations

Lecture (1hr)

Wed 2:00 PM - 3:00 PM
  Fri 8:00 AM -9:00 AM

EA303

Using SAP PowerDesigner to Simplify and Expedite Data and Analytics Challenges

Lecture (1hr)

Wed 3:15 PM - 4:15 PM

DMM272

Making SAP PowerDesigner Work for You

Hands-On Workshop (2hr)

Wed 4:30 PM - 6:30 PM
  Fri 10:30 AM -12:30 PM


Process Observer: Working with Tasks Made Easy


Hi community,

 

In this new blog, I want to focus on the definition and the use of tasks in Process Observer and on how specific tools can make your life easier.

 

Task Definition


Let me start with the meta-model:

metamodel2.png

Tasks are representatives of activities performed by users on business objects in the back-end application. The information about the execution of BO activities is published via events by the back-end application. In the process definition, tasks are assigned to process activities (= Activity Binding). While process activities are defined only in the context of a process definition, tasks are independent, reusable entities.

 

Before tasks can be used they are defined as a combination of a Business Object Type and a Task Type in the process façade (transaction POC_FACADE ‘Maintain Objects in Façade Layer’).

task definition.png

You can define header level tasks and item level tasks. Please select the checkbox accordingly. By adding customer-defined Business Object Types and Task Types (with key Z…) you can also create new custom tasks.

 

In the case of direct [non-BOR] events raised by the back-end application (see blog Process Observer (POB) Direct Event API for Logging Processes from SAP and Non-SAP Systems), no further mapping is required between a direct event and a task. It is assumed that the direct event contains information about the predecessor BO, which is then used for the binding of the process instance.

 

If you are using BOR events in your application, you need to provide an additional mapping between the BOR event and a task defined in the process façade, as described above. In order to do this, you can:

 

1) Maintain the mapping in the Customizing transaction ‘Maintain BOR Instrumentation’

bor_task_mapping.png

or 2) Implement the BAdI ‘Mapping of BOR Events to Tasks’.

img_bor_task_mapping.png

For BOR events, you may also need to consider adding information about the predecessor BO to the event (the object itself is considered a predecessor and does not need to be added). You have the following options for defining the predecessor; the options are executed in the given order.


1) Implement BAdI ‘Previous BO Determination’

pre_bo_determination.png

2) Customizing: Map Previous Objects from Business Object Repository Payload

map_from_payload.png

3) (Do nothing but) use the built-in functionality of the Document Relationship Browser (DRB) function module (find additional information about the DRB here)

 

While options 1) and 2) are recommended for productive environments, you may use option 3) for rapid prototyping. Be aware, however, of the potential performance issues that using the DRB module at runtime can cause.

After the tasks are received by Process Observer, you can further manipulate them before process definition mapping is performed. The following BAdIs are available:

  • Enrichment of Task Event Data (only for BOR events)
  • Enhance/Split Tasks (all events)

enrich task.png

Note: Up to now, BOR events and direct events have been processed in separate jobs, one after the other, which may have some side effects (see blog article Process Observer (POB) - working with BOR events made easy). If you are working with direct events for Process Observer and find duplicate events in the log, check whether the same event is also provided as a BOR event by the system by default. If necessary, deactivate the BOR event by removing the mapping from BOR event to task in transaction POC_BOR. You may also check the transaction_ID linked to the events (see blog article … direct event…), as the system normally filters out duplicate events by comparing transaction_IDs.

Now, with note 2030279, a report for unified processing of BOR events and direct events is available. Events of all types are processed together, in the order in which they occur. In order to use unified event processing, you need to stop the currently scheduled event processing report using transaction SM37, then reschedule the event processing (transaction POC_JOB_SCHEDULER) and select the ‘Unified BOR-/non-BOR Event Processing’ checkbox.

scheduler.png

If you want to execute the new report manually, execute report POCR_PROCESS_EVENTQUEUE_UNI in transaction SE38.

 

Task Binding


In order to enable Process Observer to create process logs, tasks are assigned to process activities in Customizing transaction POC_MODEL (‘Create and Maintain Process Definition’). You may assign multiple tasks to the same process activity; the activity is logged when one of the assigned tasks is observed by POB.

You can modify the way tasks are assigned to a process activity:

1) In a BAdI or a BRFplus rule, you can further evaluate a detected task-activity assignment at runtime and discard the assignment (compare with the meta-model given above).

binding_rule.png


Using this concept, you may use the same task (for example, ‘Create Object’) to bind the BO activity to different process definitions (variants of a process). Depending on an attribute, such as the object type, you may bind it to one or the other process definition by checking this attribute in the relevant starting steps of the processes.


Use BAdI ‘Rule-Based Binding of Task to a Process Activity’ as an alternative to creating a BRFplus rule.

binding_badi.png

Note: Another approach to this use case is to use different tasks in the instrumentation of the application. In some cases this may be more efficient, as it may avoid processing overhead or the generation of orphaned tasks (see below).

 

2) You may also flag tasks assigned to activities as additional tasks. This means that the tasks are only considered as additional information. Logging of the activity only takes place if at least one non-additional (or main) task occurs. Additional tasks are logged and assigned in this context, but if only the additional task occurs, no activity is logged.


Additional tasks also behave differently when determining the predecessor instance: they are not included in the predecessor search. (Note: If you rely on this behavior, please apply note 1994183.)
additional_task.png
You may use this feature to store additional information in the context of the activity: the BO-ID in the additional task does not need to be specific to the process, but you can still use it in the extended search of the process monitor.


As an example, you may have a change ID that represents the change of a material. You use the change object and ID in the main task. In an additional task, you could add the task executed with the material ID (not specific to the tracked process). In the monitor search, you can then find all change processes for a given material ID.

 

Orphaned Tasks

 

Tasks that are not bound to logged process instances at runtime are called orphaned tasks. Together with bound tasks, they are stored in table POC_D_BA_LOG. Orphaned tasks may occur due to:

  • Activation of logging on BO type level (all tasks in the context of a certain BO type are logged)
  • Extended logging of a process (not only tasks assigned to the active process definition are logged, but also all other tasks related to BO types referenced in the process definition. You normally use this setting only in testing environments.)
  • Discarding of process binding in the binding BAdI or BRFplus rule (see above)
  • Errors in the process definition or application instrumentation (for example, the predecessor BO is missing, so the binding of the task to a process instance does not take place)

 

Orphaned tasks may therefore be of interest for detecting errors in the process definition and instrumentation, especially at design time.

 

Task Monitor and Additional Tools


There are different transactions that you can use to review the logged tasks without entering via the process monitor. This is especially helpful in the implementation and testing time of your process definition and instrumentation to check whether events or tasks have successfully reached Process Observer, and whether they are bound to a process.

  • POC_TASK_ORPHAN: Lists orphaned tasks
  • POC_TASK: Lists all tasks with their assignments to processes and activities

task_monitor.png

While there is no general Process Observer setting to prevent the logging of orphaned tasks, we have provided a sample BAdI implementation that filters out orphaned tasks before writing the process logs to the DBs. To use this feature, please implement note 2018078.


The corrections from the notes mentioned in this document are also available with these SPs:

  • SP 13 for SAP Business Suite Foundation 7.31
  • SP 08 for SAP Business Suite Foundation 7.46

Detection and Handling of Stuck BPMN Request-Confirmation Pattern


Titel.png

Introduction

In this blog post we take a closer look at a category of problems that occur during communication between SAP Process Orchestration (including BPM) and back-end services. Based on our experience, we recommend patterns and best practices, and present the idea of a custom application that is able to detect and handle process instances affected by the typical symptoms.

What is the Request-Confirmation Pattern?

To retrieve data from a back-end service in a BPM process, one might want to use an automated activity modeled with a synchronous Web service interface (WSDL). However, synchronous Web service calls have the following drawbacks:

  • They come at a high cost, since they occupy system resources (threads, transactions, memory, execution progress in BPM) while waiting for the back-end response to be processed there.
  • Usually, it is not possible to utilize reliable exactly-once communication protocols such as WS-RM.

Therefore, it is recommended to implement communication that requires a response from a back-end service with the help of a so-called request-confirmation pattern. The automated activity ‘Request’ is called asynchronously and returns immediately with an empty response. The process then waits for a separate response message containing the requested data at the intermediate message event (IME) ‘Confirmation’:

01.png

Usually, the response message contains a correlation key (e.g. business key) which is used by the BPM system to determine which process instance is addressed by this particular response message.
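The correlation lookup can be sketched in a few lines of JavaScript. Note that all names here (`correlate`, `businessKey`, `instanceId`) are illustrative; in SAP BPM the correlation condition is defined in the process model, not in custom code:

```javascript
// Sketch: matching an incoming confirmation message to the waiting
// process instance via a shared business key. In SAP BPM this matching
// is performed by the runtime, based on the modeled correlation condition.
function correlate(waitingInstances, confirmation) {
    // waitingInstances: array of { instanceId, businessKey }
    // confirmation:     { businessKey, payload }
    return waitingInstances.find(function (instance) {
        return instance.businessKey === confirmation.businessKey;
    }) || null;
}
```

If no waiting instance matches the key, the message cannot be delivered, which is exactly the kind of situation discussed in the next section.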

 

How Can Process Instances Get Stuck In Back-End Communication?

In its simplest variant, this pattern does not account for response messages that never reach the waiting process. Such cases might occur for various reasons:

  • Outbound communication issues when performing the automated activity
  • Errors during back-end service processing
  • Inbound communication issues when receiving the reply message

Issues in the outbound communication can be handled via the technical error handling on the modeling side of the automated activity, which is described in a separate blog post. Within this article we will address other cases where a process is stuck while awaiting the response on the IME.

In the ideal case, the invoked back-end service is able to cope with certain error situations and replies to the BPM process with an error message rather than with a regular business response. A BPM process developer could then enhance the model with a decision gateway that analyzes the content of the confirmation message received:

02.png

The back-end service cannot always be adapted so that it reliably sends either a regular confirmation or an error reply with an error code. In addition, depending on the guarantees the messaging infrastructure provides, the message could get stuck or lost between the communication partners. In those cases where the back-end response (if any) does not arrive at the BPM process, the process instance will be stuck.

A stuck BPM process instance cannot be easily spotted: in the SAP NetWeaver Administrator’s ‘Manage Processes’ view, for example, the process still remains in status ‘In Progress’ because there is no real technical issue with the instance. The process instance is stuck logically, not technically, which makes it necessary to check the process details in order to find such cases. In systems with a low number of process instances, you could check the back end for each individual case. But this becomes much more work-intensive in Process Orchestration landscapes where a huge number of processes is executed, each with multiple back-end interactions.

 

Automatic Self-Detection of Stuck Process Instances with BPMN Means

In order to automatically detect such logically stuck process instances, the BPM flow can be enhanced by a timeout branch. You can implement this by using an ‘event based choice’ gateway and a timer event in addition to the ‘Confirmation’ IME:

03.png

The semantics of the event based choice gateway is as follows: whatever comes first (the confirmation message of the back end or the elapsed timeout) will continue the flow. In regular cases, the confirmation message arrives before the timeout and continues the flow in a regular manner. In case the confirmation does not arrive in time, the timeout timer event is triggered and continues the exception branch.

The value for the timeout in the timer event determines after how much time the process instance is considered to be affected by exceeding the expected process time in the back end. It needs to be defined according to the expected response time of the back-end service, perhaps it even needs to be adapted over time. In case the timeout is chosen too low, it might produce false-positive cases; if it is defined too high, it might take additional time until a process instance is detected as being affected. At the outgoing edge of the timeout timer event you need to model an exception handling procedure.

There is no generic solution for all conceivable failure situations; your business scenario should be considered when defining the error handling. Do you have mostly ‘human-centric’ processes with a small or medium number of instances per day? In such cases, a complex mitigation might not be required. You could simply send an e-mail notification to the process administrator, who can then validate the affected process instance and trigger the necessary correction steps.

SAP BPM also supports system-centric scenarios. Such processes usually involve limited human interaction and are characterized by automated activities, IMEs, and mapping activities. System-centric processes are often executed as high-volume scenarios, where single-case notifications and individual error handling are not feasible. The error handling and correction steps need to support a high number of affected process instances.

 

Mass-Enabled Handling of Stuck Process Instances

In our scenario, the assumed reason for a high number of affected process instances is a failing back-end operation due to incorrect data originating from the BPM process. This might be caused by an update of master data or general keys that is not reflected in the BPM process. Such cases require selecting process instances based on their context data in order to change it for multiple processes in the same way. The procedure could look as follows:

04.png

 

Here is one example implementation for a data correction process.

05.png

In order to separate the exception handling branch from the remaining process, we put everything related to the ‘data correction’ into an embedded sub process.

The following block diagram shows the interaction of the data correction process getting triggered in case the back-end response does not arrive in time:

06.png

(Only) for processes reaching the timeout, the mapping activity ‘Store Data’ writes the context data (field names and values) from the process context into a custom data store. This could be a simple table with the following columns:

 

Primary KeyProcess Instance IDData NameData Value
190c0a367996b11e3b49400000034b6ba/salesorder/lineitem/orderno4711
290c0a367996b11e3b49400000034b6ba/salesorder/lineitem/productNamebread
3000c0de3028e72000c0de3028e720000/salesorder/lineitem/orderno0815
4.........

 

As the global process context from the main process is also available in the embedded sub process, it can simply be accessed by the ‘Store Data’ mapping activity. The mapping activity calls a simple custom-developed mapping EJB to write the selected context values into the custom data store. Note that the implementation of the mapping EJB needs to be idempotent as for reasons of error recovery the BPM runtime might execute the given activity for the same instance several times.
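The idempotency requirement can be illustrated with a small sketch: if the store is keyed by the combination of process instance ID and data name, a re-executed activity simply overwrites the existing entry instead of inserting a duplicate. The function and field names below are hypothetical; the real implementation writes to a database table from the mapping EJB:

```javascript
// Sketch: idempotent write to the custom data store. The key is the
// combination (process instance ID, data name), so executing the same
// activity twice for the same instance overwrites rather than duplicates.
// A plain Map stands in for the database table here.
var dataStore = new Map();

function storeContextValue(instanceId, dataName, dataValue) {
    dataStore.set(instanceId + "|" + dataName, {
        instanceId: instanceId,
        dataName: dataName,
        dataValue: dataValue
    });
}
```

In a database-backed implementation, the same effect is achieved with an upsert (insert-or-update) on a unique key over the two columns.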

The next step in BPM is the human activity ‘Access Data’. The process creates the human task and waits for input, i.e. task completion. The system administrator could be notified by e-mail; he then queries the data persisted in the custom store for affected instances using an SAPUI5 application. Such an application can be built with regular Java means to query for the affected process instances based on the stored data values. The following screenshot shows such an application with example data: the product ‘Schwarzbrot’ was specified in German, but the back end only knows the English version. The proposal is to change the affected data to the English word ‘bread’ for all detected instances with product ‘Schwarzbrot’.

07.png

Based on this investigation, the administrator can decide to change the data so that the back-end service can process it correctly. Before changing the data of all affected process instances, the data can be modified for just one process instance by providing the corrected data in the task UI and completing the task manually, to verify that the change is successful.

08.png

In case huge numbers of processes with the same problem pattern need to be corrected, the custom SAPUI5 application can loop over all relevant tasks via the BPM OData Task Service to complete them with the new data value(s). In both cases, the output mapping of the human activity updates the BPM process context with the corrected value(s).
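The mass correction step can be sketched as preparing one task-completion payload per affected task, in the same spirit as the OData OutputData structure shown at the beginning of this page. All structure and field names here, such as `HandleRequestType` and `ProductName`, are assumptions for illustration; the actual output structure depends on the task's output type:

```javascript
// Sketch: building OutputData payloads for mass task completion.
// Each payload would subsequently be POSTed to the BPM OData Task
// Service (bpmodata) for its task; only the payload construction is
// shown here, since the structure is what varies per task type.
function buildCompletionPayloads(taskIds, correctedValue) {
    return taskIds.map(function (taskId) {
        return {
            taskId: taskId,
            outputData: {
                HandleRequestType: {
                    Request: { ProductName: correctedValue }
                }
            }
        };
    });
}
```

The loop over the payloads would then call the task service once per task, e.g. via `odataModel.create("/OutputData", …)` as in the introductory example.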

 

How Should the Treated Process Continue?

After the data correction step is executed, there are various options to continue in the BPM process:

  • You might want to send the request with the corrected data to the back end again.
  • You might want to start a compensation process from beginning and terminate the existing process.
  • In case you didn’t perform any data modification in BPM, but the issue got solved on back-end side, you might again want to wait for the response message.

Modeled in BPMN, this could look like the picture below:

09.png

An exclusive choice gateway is inserted after the ‘data correction’ sub flow to determine which path to take. Not all the options might be required for all situations.

 

Conclusion

This blog post provided some ideas on how to detect and handle processes stuck due to a communication or processing error in a request-confirmation pattern. The ideas are based on BPMN means plus a few lines of Java code (such as a mapping EJB or a simple UI that visualizes the content of a custom database table). Depending on scenario-specific requirements, the example can be adapted and extended accordingly.

Using BPM OData Service for Accessing Value Help in Java


We started using the new Value Help service and I think it's great, as you can directly use the ABAP value help and don't need to wrap it in an extra web service or similar workarounds.

 

Here are some useful links regarding this topic:

 

However, I had a little trouble using filters.

There is no problem using only one filter - you can go the way described in the SAP Help. But when I had to set multiple filters, I could not find an example, and it is not straightforward because it is not the default OData filter syntax. The solution: you have to provide the filters in the URL separated by semicolons:

 

http://<host>:<port>/bpmodata/valuehelp.svc/<AliasName>/HelpValues?filter=<filterAliasName>:<filter value>,operation(EQ|CP|NE|LT|LE|GE|GT);<filterAliasName>:<filter value>,operation(EQ|CP|NE|LT|LE|GE|GT);<filterAliasName>:<filter value>,operation(EQ|CP|NE|LT|LE|GE|GT)
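As a sketch, the semicolon-separated filter string can be assembled like this (the helper name and the filter aliases `Country` and `Name` are made up for illustration; the URL pattern follows the one above):

```javascript
// Sketch: assembling the value help URL with multiple filters,
// each filter given as alias:value,operation and joined by semicolons.
function buildValueHelpUrl(host, port, aliasName, filters) {
    // filters: array of { alias, value, operation }
    var filterString = filters.map(function (f) {
        return f.alias + ":" + f.value + "," + f.operation;
    }).join(";");
    return "http://" + host + ":" + port + "/bpmodata/valuehelp.svc/" +
        aliasName + "/HelpValues?filter=" + filterString;
}
```

For example, two filters (an EQ and a CP filter) produce `…/HelpValues?filter=Country:DE,EQ;Name:S*,CP`.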

Process Mining with Process Observer and Fluxicon Disco


Dear Process Observer community,

 

In one of our previous posts, we showed you how to extract process log information from Process Observer in the MXML format and how to import and analyze it with ProM. While ProM is a free open-source tool that requires a certain level of expertise to use, we alternatively used Fluxicon's Disco. Fluxicon is a spin-off of the University of Eindhoven (which developed ProM), and its product Disco is a professional variant of the process mining tool: more user-friendly and thus more suitable for day-to-day use in business.

 

The good thing about using Disco is that it can import the Process Observer data extraction in the MXML format that we generated for the ProM case. So all you need to do is install notes 1832016 and 2011730 - as described in the previous article - and run the extraction report. Disco can then import the generated file. Click ‘Open file’ to import the exported MXML file from the local PC:

disco_open_file.png

In the example shown, we extracted Process Observer data from a process orchestrated with a Process Object built with the SAP Process Object Builder (aka “POL”). Process Objects are used to orchestrate service-based processes, such as “loan” processes in banking. For more information about the integration of Process Objects with Process Observer, see Logging & Tracing.

 

The scenario being discussed is a “mobile loan” scenario. It allows customers to buy goods in retail stores and to pay with a loan. Process Object Builder calls the services needed for executing this process.

 

pol_process.png


The complete (click-through) demonstration can be found here: http://demo.tdc.sap.com/SpeedDemo/694212576a084610/

 

The process is executed in phases:

  • “Start”
  • “Create”: Retrieve customer data, Retrieve limit
  • “Check”: Check limit
  • “Execute”: Create loan, and so on

 

A positive creation and execution ends with a confirmation, problems during execution result in a failed state. Inconsistent data (such as a limit issue) may lead to an error. If the decision is to cancel the process, compensation might be needed before it ends in a cancelled state.

 

After the import, Disco shows the process flow including the error situations:

mining_overview.png

Note that the shown process map is simplified by not showing some of the less frequent paths in the flow. The level of detail of the shown process map can be adjusted interactively in the software.

 

In addition to the absolute frequency of the steps and paths, the tool shows the performance of the execution, highlighting execution bottlenecks. For example, in the following illustration, a more detailed process map is shown with the cumulative times between the state changes for all cases (visual mark-up and leading metric) and the average times between them (in a smaller font). 

performance_view.png

Filtering allows you to set the focus to certain process variants, or processes, in which certain steps occur (for example only processes with errors) and finally to drill down to the level of single cases for further analysis:

instance_view.png

To evaluate the process further on the service call level, we have created a specific extraction report, exposing the “callable business entities” logged in Process Observer as steps. The resulting data allows for the analysis of the process data on a deeper level, for example, allowing you to evaluate the performance of processes with errors:

analysis_view.png

For us, it was surprising to see how easily we could evaluate our process data. While we did this all on test data in our own systems, we are now in the phase of evaluating this approach with a real customer.

 

Try evaluating your own scenario with Process Observer and Disco. You may download a demo version of Disco from the Fluxicon website: http://fluxicon.com/disco/ and request an extended evaluation license (to import larger files) from anne@fluxicon.com. A Disco project file (containing the different views from this article) that can be imported with the demo version can be downloaded here.

 

To gain deeper insight into Process Objects and Process Object Builder, see http://help.sap.com/pobuilder10 or contact Karsten Egetoft.

 

Stay tuned for more Process Observer use cases!

 

Bernd

Process over Product! What was I thinking?

$
0
0

Hello All,

 

A few months ago, I blogged about the importance of process over product (Product over Process). I believed strongly in the importance of having a process and adhering to it strictly at all times. In the blog, I put the blame squarely on the managers, who seemed to push for the end product and conveniently ignore the process. As a developer, my sympathies were with all those developers, whom I projected as the noblest of all. Sense had to be knocked into my head, and it happened today.

 

I visited a gadget store to purchase a washing machine, carrying a shopping bag along. On returning home, we realized that the shopping bag had been left behind in the store. The store had already closed, so we decided to go back the next day and collect it. I was in for a shock. When we went to the store, the manager claimed that the bag had never been collected. I was outraged and vented my anger at the staff. The manager stated that they have a process and would be unable to help much, as we had no token for the bag. I insisted that a process exists to accomplish a purpose, not to defeat it. I argued vehemently for the manager to break the process and assist me in checking the CCTV footage, for the purpose of customer satisfaction. The manager kept stressing the process and how it was sacrosanct. I was livid. But I was the one who had earlier vouched for adherence to process over product. What was I thinking?

 

As a customer, I never cared about the process. My sole interest was the resolution of my problem. I was fixated on the end product/purpose to be served. The process, to me as a customer, was only an enabler in fulfilling my purpose. Is it not the same with the customers at my work? As part of the team serving the customer, I had failed to understand the customer's emotion. The importance of the end product to the customer over the underlying process came across as the customer's lack of understanding. Now that I was in the customer's shoes, I could feel the frustration when the same process was thrust upon me instead of my issue being resolved.

 

Perspective. Every argument, every situation, every viewpoint depends on the perspective from which it is seen. We should never disregard a viewpoint, as it could be of utmost value from that person's perspective. Experience teaches that no one is always right and every rule has an exception.

 

So, with this personal experience, I decided to change my viewpoint and give importance to product over process. Well, there was a twist to the tale.

 

As stated earlier, when I requested to view the CCTV footage, the manager clearly denied it. The manager inspected the footage and claimed it showed that we had not carried the bag into the store. My anger boiled over, and I asked for permission to view the footage myself. Again, the process stopped me from accessing the footage without a legal complaint. I had to leave the store, and before going to the police for help, I decided to check with the nearby store. It is safe to guess that I had left the bag in that previous store. It was the earlier store where the security staff had not given us a token for the bag. The process was not followed in the store where the bag was actually left, as the security staff failed to hand over a token. But my purpose was served when I visited that store. The retail store which stuck to its process made me angry and frustrated; I felt that my purpose was not served by its manager.

 

Process still triumphs over product, but perspective and emotion impact our decisions. It is definitely a rule (process over product) that I still vouch for, but with the rider that the product holds the complete value for the customer. Respect the perspective - the developer's, the manager's and the customer's.

 

Regards,

Sharath


SAPUI5 Integration into SAP BPM Made Easy




Introduction: Previously, integrating custom UIs into SAP BPM involved the use of the BPM Java API. With the introduction of the BPM OData service, integrating custom UIs - and SAPUI5 in particular - into a BPM process has become much easier. This article provides a detailed insight into how to execute a human task from an SAP BPM process using the SAPUI5 technology.


Prerequisites: NWDS 7.31 SP13 and above

Generating a SAPUI5 DC from your SAP BPM

After creating a sample BPM process as shown below, perform the following actions to generate the SAPUI5 user interface from within the process:

a)      Create or import an XSD schema to be assigned to the data object that will hold the process's data. In this article, a service interface from SAP PO's Enterprise Service Repository was imported and assigned to the process.

b)      Select the human activity “Approve” from the process. In the property section, on the Task sub-tab, choose the “new” option for the task attribute, as indicated by Figure 1.

SAPUI5_1.jpg

                                        Figure 1: Creating a new SAPUI5 application from SAP BPM.

c)      A new wizard will start to facilitate the generation of the SAPUI5. Follow the wizard to create a development component for SAPUI5. Name the task as shown in Figure 2.

SAPUI5_2.jpg

                              Figure 2: Naming the SAPUI5 Task.

d)      For the attribute UI Technology, select the value “SAPUI5” from the dropdown. If a development component needs to be created, choose the option new and follow the wizard to create a new DC. See Figure 3.

 

SAPUI5_3.jpg

                              Figure 3: Selecting the SAPUI5 Technology.

e)      Follow the wizard until the last step to select the Data Object to be used as the basis to generate the SAPUI5 user interface. Refer to Figure 4.

SAPUI5_4.jpg

               Figure 4: Process Context details for generating the SAPUI5 application.

f)      Once you click on the finish button, the SAPUI5 technology generates its own data type from the original data object within the process context. The result is presented in Figure 5.

SAPUI5_5.jpg

Figure 5: Custom SAPUI5 Data types generated.

g)      Open the task folder under the SAP BPM project tree and select the role tab. From there select user(s) from the UME to be assigned to this task as potential owners. See Figure 6.

SAPUI5_6.jpg

 

                              Figure 6: Assigning task potential owner to SAPUI5 for task execution.

h)      From the NWDS tool, switch to the “Web” perspective to view all the files which have been generated for the SAPUI5 DC. Navigate the DC project structure and notice the generated folders, libraries, JSON and index files. An impression of the project tree structure is shown in Figure 7.

SAPUI5_7.jpg

 

                         Figure 7: JavaScript code generated for the SAPUI5  controller

 

i)      The created SAPUI5 DC needs to be deployed to the SAP PO server as an enterprise archive file. You need to create an EAR DC and add the SAPUI5 DC as a dependency to it. In Figure 7, you can see an EAR DC named “dc_sapui5_aer”. Figure 8 demonstrates how the dependency to “dc_sapui5” is created.

SAPUI5_8.jpg

 

                    Figure 8: EAR DC created its dependency to the SAPUI5 DC.

 

Note: Extra steps need to be taken to ensure that the human task notifications of BPM are sent to the BPM Inbox. In SAP PO, by default, the value for the notificationTask inbox is set to the Universal Worklist (UWL). If you would like the notifications to be consumed by the BPM Inbox instead, this value needs to be set to “bpm_inbox”.

 

j)      Configure the “notificationTask” inbox settings by navigating to the Java System Properties in SAP NetWeaver Administrator (NWA) and set the value as shown in Figure 9.

SAPUI5_9.jpg

 

               Figure 9: NWA settings for notificationTask from uwl to bpm_inbox

k) It is finally time to build and deploy the BPM process and the EAR DC. Building and deploying the EAR DC automatically performs the same actions for the SAPUI5 DC because of the dependency between them that we previously defined.

 

After the DCs have been deployed, the BPM process can be started from the Process Repository application in the NWA.

The steps described above have highlighted the activities needed to generate a SAPUI5 application from within a SAP BPM process context. In the next steps, we will be performing a test to complete a task from our SAPUI5 application using BPM Inbox. Once an employee triggers an order request from a web client, this data is passed to the BPM process. The manager can then view and complete the task which has been assigned to him/her via the BPM inbox.

 

Note: The URL of the BPM Inbox to view and complete the task to be consumed by a SAPUI5 application is  http://<hostname>:<port>/bpminbox

The following roles should be assigned to a user to be able to access the bpm inbox:

1. UnifiedInboxUserRole: this role allows the user to view the list of tasks and their details in the BPM inbox.

2. com.sap.bpem.Enduser: enables users to manage and work on the task.

 

Proceed to the BPM Inbox to claim the employee task and fill in the details. Once the employee task is completed, you can claim the manager's task and complete it, as shown in Figure 10.

SAPUI5_10.jpg

 

                         Figure 10: BPM Inbox tasks for SAPUI5.

Finally, to complete the task from the SAPUI5 application, open a web browser and enter the URL pattern described above. If the application has been properly deployed with the EAR file, it is accessible via the browser.

SAPUI5_11.jpg

               Figure 11: Completing a task from the SAPUI5 application via the web browser.

The manager can preview the details filled or change any of the previous values passed into the process by the employee, then complete the task.

In the previous paragraphs, we have been able to discuss the basic steps involved in executing a human task within a SAP BPM process using the SAPUI5 technology. Note that the SAPUI5 application generated within the process context can be modified and extended to meet the business and functional requirements desired.

 

Conclusion: This blog has highlighted the necessary steps to integrate SAPUI5 into an SAP BPM process, so that tasks can be completed from different devices.

 

Written by: Abidemi Olatunbosun

BPM OData: Administrative Process UI


This blog post, as part 6 of BPM OData blog series, refers to the OData Service in SAP BPM available with SAP NetWeaver 7.3 EHP 1 SP 09 and higher. The features described in this blog post are available with SAP NetWeaver 7.3 EHP 1 SP 14 and higher. Before reading this blog post, it is recommended to read the previous parts of BPM OData blog series.

 

Overview

This blog post as well as all the previous blog posts about the BPM OData service is based on the business scenario in a credit institution. Previously, the scenario was considered from the point of view of the different participants. We were considering it from the point of view of the employees of the credit institution when we were talking about implementation of a task UI as well as from the point of view of the customer who submitted his data in the blog post about a process start UI. In this blog post, the scenario will be considered from the point of view of the department manager of the credit institution. One of the responsibilities of such a manager is to monitor and manage all the activities in his department. In our case, the BPM process containing the tasks to verify the customer data and to provide a credit limit can be considered as an example of the activity. Therefore, monitoring and management of the activities becomes monitoring and management of the processes. This blog post describes how to build an SAPUI5 application for such an administrative UI.

 

BPM Processes OData Service

Starting from SAP NetWeaver 7.3 EHP 1 SP 14, the BPM OData service provides functionality to access a collection of processes and to perform operations on a process. For this purpose, the BPM Processes OData service is used. Like all BPM OData services, it is available under the ‘bpmodata’ root URL and has the name ‘processes.svc’. The set of operations provided by the service includes the following:

  • Access to a collection of BPM processes
  • Suspend a process
  • Resume a process
  • Cancel a process

 

The service URLs can be represented using the following pattern:

http://<host>:<port>/bpmodata/processes.svc/<OData_resource_path_and_query_options>

 

Accessing a Collection of Processes

One of the responsibilities of the department manager in the business scenario is to monitor the processes in his department. As part of this activity, the manager should be able to see the processes, including information about the process timeline, status, and the person who started the process. To represent information about a process, the BPM Processes OData service provides the ProcessInstance entity type, which contains all the required information. The metadata for the entity type is shown below:

 

<EntityType Name="ProcessInstance">
    <Key>
        <PropertyRef Name="InstanceId"/>
    </Key>
    <Property Name="InstanceId" Type="Edm.String" Nullable="false"/>
    <Property Name="Name" Type="Edm.String" Nullable="false"/>
    <Property Name="Subject" Type="Edm.String" Nullable="true"/>
    <Property Name="StartDate" Type="Edm.DateTimeOffset" Nullable="false"/>
    <Property Name="EndDate" Type="Edm.DateTimeOffset" Nullable="true"/>
    <Property Name="ModelId" Type="Edm.String" Nullable="false"/>
    <Property Name="DefinitionId" Type="Edm.String" Nullable="false"/>
    <Property Name="Status" Type="Edm.String" Nullable="false"/>
    <Property Name="ParentProcessInstanceId" Type="Edm.String" Nullable="true"/>
    <Property Name="RootProcessInstanceId" Type="Edm.String" Nullable="true"/>
    <Property Name="ProcessInitiatorName" Type="Edm.String" Nullable="true"/>
</EntityType>

 

In order to get a list of processes from the OData service, the ProcessCollection entity set is used, which represents a collection of ProcessInstance entities.

More information about the service entity model can be found in the official documentation.

 

Obviously, a credit institution can have lots of processes. Some of them may be completed because a credit limit has been provided for the customer; others may be canceled because the customer decided not to take a credit. Showing such processes to the manager makes no sense and only makes monitoring more difficult. Moreover, provisioning all the processes via the OData service takes a lot of processing time on the server. Such a pointless server load keeps the manager waiting, which is definitely not what is expected. Based on the previous blog posts, some of you have probably guessed that this leads to the $filter OData query option, which limits the number of returned entities and reduces the workload on the server. With the help of the $filter query option, the manager can specify which processes he would like to see instead of waiting until all the processes are provided by the OData service.

 

In the BPM Processes OData service, every request to ProcessCollection entity set must have $filter query option in the request URL.

 

The table below shows the URL used to access a collection of processes along with the service response:

HTTP Method: GET
URL: …/bpmodata/processes.svc/ProcessCollection?$filter=Status eq 'IN_PROGRESS' or Status eq 'SUSPENDED'&$format=json
Response Body (simplified):

{
    "d": {
        "results": [
            {
                "InstanceId": "ffef91d67b8a11e4c1f300000034b6ba",
                "Name": "CreateCustomerProcess",
                "Subject": "Customer creation process",
                "StartDate": "/Date(1417679765790)/",
                "EndDate": null,
                "ModelId": "e007956a10a6daea576611e3c5c62c413890274c",
                "DefinitionId": "6c09936a310d118017f49889dfe194ec",
                "Status": "IN_PROGRESS",
                "ParentProcessInstanceId": null,
                "RootProcessInstanceId": null,
                "ProcessInitiatorName": "John_Smith"
            }
        ]
    }
}
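Note the StartDate value: the service returns timestamps in the OData Version 2 JSON wire format, /Date(&lt;milliseconds&gt;)/. A small helper (an illustration, not part of the service or of SAPUI5) can convert such a value into a JavaScript Date:

```javascript
// Convert an OData V2 JSON date string like "/Date(1417679765790)/"
// into a JavaScript Date object. Returns null for unexpected input.
function parseODataDate(value) {
    var match = /\/Date\((-?\d+)\)\//.exec(value);
    return match ? new Date(parseInt(match[1], 10)) : null;
}

var startDate = parseODataDate("/Date(1417679765790)/");
// startDate.getTime() === 1417679765790
```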

 

In the sample URL, the $filter query option is used to retrieve all the running and suspended processes. Besides $filter, the $top and $skip query options are also supported for the ProcessCollection. More information about the supported URLs for the ProcessCollection entity set and about the usage of the $filter query option can be found in the official documentation.
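The $filter value from the sample URL can also be assembled programmatically. The sketch below (plain JavaScript; the function name is illustrative) joins several status values with or, producing the same expression the request above uses:

```javascript
// Build a $filter expression matching any of the given process statuses,
// e.g. "Status eq 'IN_PROGRESS' or Status eq 'SUSPENDED'".
function buildStatusFilter(statuses) {
    return statuses
        .map(function(status) { return "Status eq '" + status + "'"; })
        .join(" or ");
}

var filter = buildStatusFilter(["IN_PROGRESS", "SUSPENDED"]);
// "Status eq 'IN_PROGRESS' or Status eq 'SUSPENDED'"
```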

Of course, the department manager is only interested in the processes running in his department. Therefore, the service returns only those processes for which the current user is the process administrator. As a result, the manager sees only the processes he is allowed to see and the ones he can manage, i.e. suspend, resume or cancel.

 

Performing Operations on a Process

Besides providing access to a collection of processes, the BPM Processes OData service also allows you to suspend, resume or cancel a process. In our business scenario, the process can be suspended if additional information should be provided by the customer, and resumed once the required information is provided. The process can be canceled if the customer changed his mind and decided not to take a credit.

 

All the mentioned process actions are implemented as the Suspend, Resume and Cancel function imports in the BPM Processes OData service. Each of the function imports works in a similar way, requiring the identifier of the process instance as the only input parameter and returning the suspended, resumed or canceled process instance in the service response, respectively.
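Each function import is called with a POST request whose URL carries the process instance identifier as a quoted string. A helper that builds such host-relative URLs might look like this (the function name is our own; only the URL shape comes from the service):

```javascript
// Build the relative URL for a BPM process function import
// (Suspend, Resume or Cancel) for the given process instance.
function buildProcessActionUrl(actionName, instanceId) {
    return "/bpmodata/processes.svc/" + actionName +
        "?InstanceId='" + encodeURIComponent(instanceId) + "'";
}

var suspendUrl = buildProcessActionUrl("Suspend", "ffef91d67b8a11e4c1f300000034b6ba");
// "/bpmodata/processes.svc/Suspend?InstanceId='ffef91d67b8a11e4c1f300000034b6ba'"
```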

 

The table below shows the URL used to suspend a process along with the service response:

HTTP Method: POST
URL: …/bpmodata/processes.svc/Suspend?InstanceId='ffef91d67b8a11e4c1f300000034b6ba'
Request Headers:
    Authorization: Basic dXNlcm5hbWU6cGFzc3dvcmQ=
    X-CSRF-Token: 781057a9-b96a-468c-b393-981f98292335
    Accept: application/json
Response Body:

{
    "d": {
        "InstanceId": "ffef91d67b8a11e4c1f300000034b6ba",
        "Name": "CreateCustomerProcess",
        "Subject": "Customer creation process",
        "StartDate": "/Date(1417679765790)/",
        "EndDate": null,
        "ModelId": "e007956a10a6daea576611e3c5c62c413890274c",
        "DefinitionId": "6c09936a310d118017f49889dfe194ec",
        "Status": "SUSPENDED",
        "ParentProcessInstanceId": null,
        "RootProcessInstanceId": null,
        "ProcessInitiatorName": "John_Smith"
    }
}

 

In order to suspend, resume or cancel a process, the service user must be the process administrator.

 

Implementing the UI

Knowing the details of consuming the BPM Processes OData service, the next step is to implement an administrative UI on top of it. As mentioned before, such a UI should allow the department manager to monitor and manage the processes in his department. In this blog post, the following UI is used as a sample:

 

admin_UI.png

 

In general, the administrative UI for the department manager looks pretty similar to the task inbox UI described in one of the previous blog posts. Besides the visual similarity, the implementation of the two UIs is also quite similar. Therefore, it is highly recommended to read the blog post about the implementation of a custom task inbox UI before continuing. In the following sections, only the differences between the two implementations are described.

 

Implementing a UI for Accessing a Collection of Processes

Implementation of almost every SAPUI5 application for an OData service starts with the creation of a service-specific ODataModel instance, and the administrative UI for the BPM Processes OData service is no exception. The code snippet below shows the definition of the ODataModel in the onInit() function of the SAPUI5 controller for the administrative UI:

 

onInit : function() {
    var processesServicePath = "/bpmodata/processes.svc/";
    var processesODataModel = new sap.ui.model.odata.ODataModel(processesServicePath, true);
    processesODataModel.setDefaultCountMode(sap.ui.model.odata.CountMode.None);
    processesODataModel.setDefaultBindingMode(sap.ui.model.BindingMode.OneWay);
    this.getView().setModel(processesODataModel);
}

 

After the model is defined and set to the view, it is time to configure UI bindings. In our case, we use a table to show the processes on the UI. Therefore, the bindings should be specified for the table columns:

 

var processesTable = new sap.ui.table.Table({
    title : "Credit department processes"
});
// define column bindings
processesTable.addColumn(new sap.ui.table.Column({
    label : new sap.ui.commons.Label({text : "Name"}),
    template : new sap.ui.commons.TextView({text : "{Name}"})
}));
processesTable.addColumn(new sap.ui.table.Column({
    label : new sap.ui.commons.Label({text : "Subject"}),
    template : new sap.ui.commons.TextView({text : "{Subject}"})
}));
processesTable.addColumn(new sap.ui.table.Column({
    label : new sap.ui.commons.Label({text : "Status"}),
    template : new sap.ui.commons.TextView({text : "{Status}"})
}));
processesTable.addColumn(new sap.ui.table.Column({
    label : new sap.ui.commons.Label({text : "Started at"}),
    template : new sap.ui.commons.TextView({text : {
        path : "StartDate",
        formatter : function(date) {
            if (date) {
                var dateFormatter = sap.ui.core.format.DateFormat.getDateTimeInstance({style : "medium"});
                return dateFormatter.format(date);
            }
            return "";
        }
    }})
}));
processesTable.addColumn(new sap.ui.table.Column({
    label : new sap.ui.commons.Label({text : "Started by"}),
    template : new sap.ui.commons.TextView({text : "{ProcessInitiatorName}"})
}));

Having bindings for the table columns does not yet mean that the table will be populated with data. So far, in the controller, we have only set the model to the view; we have not bound the table to the collection in the model that represents the processes.

 

As mentioned before, to get a list of processes from the BPM Processes OData service, a request should be sent to the ProcessCollection entity set, and all requests to this entity set must contain the $filter query option in the request URL. Exactly the same situation was described in the blog post about implementing a custom task inbox UI, where it was necessary to get tasks from the TaskCollection entity set of the BPM Tasks OData service. In that blog post, we used SAPUI5 Filter objects to define the value of the $filter query option, and the created filters were then used in the aggregation binding for the UI table with the tasks. In this blog post, we have to do exactly the same, adjusting the filters and the binding path to the BPM Processes OData service.

 

First of all, let us define the filters. The department manager is interested in the processes his subordinates are working on, so the filters should show only running or suspended processes:

 

var inProgressStatus = new sap.ui.model.Filter("Status", sap.ui.model.FilterOperator.EQ, "IN_PROGRESS");
var suspendedStatus = new sap.ui.model.Filter("Status", sap.ui.model.FilterOperator.EQ, "SUSPENDED");

 

Having the filters, the next step is to define the aggregation binding for the UI table to ensure that the created filters will be used as a value of $filter query option when the binding is resolved by the ODataModel, i.e. when a request is sent to the OData service.

 

processesTable.bindRows("/ProcessCollection", null, null, [inProgressStatus, suspendedStatus]);

 

Such an aggregation binding will be resolved by the underlying ODataModel by sending the following request to the BPM Processes OData service:

 

…/bpmodata/processes.svc/ProcessCollection?$filter=Status eq 'IN_PROGRESS' or Status eq 'SUSPENDED'

 

That's it! Now, we have the table with the processes on the UI.

 

The entire implementation of the SAPUI5 view for the sample application can be found in the attached processAdmin.txt.zip file. In the sample application, static filtering is implemented, meaning that the $filter expressions are ‘hardcoded’ in the SAPUI5 view implementation. Dynamic filtering can also be implemented by adding a number of additional UI controls to specify the filter criteria, selecting a property by which the processes should be filtered (e.g. Status) and the filtering value (e.g. IN_PROGRESS).

 

Implementing a UI for Process Actions

In the previous section, we populated the table on the UI with the processes provided by the OData service. Now, the department manager can monitor the processes in his department. But he is a manager, meaning that he also wants to manage the processes that he can see. For that purpose, we have Suspend, Resume and Cancel buttons in the table toolbar. The Suspend, Resume and Cancel process actions are represented as the corresponding function imports in the BPM Processes OData service; therefore, clicking each of these buttons should result in sending the corresponding request to the service. A similar situation has already been described in the blog post about implementing a custom task inbox UI, where it was necessary to call a function import in order to release a task. In our case, the situation is pretty much the same except for the names of the function imports and their input parameters.

 

As usual, a JavaScript function which calls a function import should be implemented in the controller. Because the function imports for the process actions are similar to each other, we can handle the ‘press’ events of the UI buttons with one generic JavaScript function:

 

executeProcessAction : function(actionName, processInstanceId) {
    if (!processInstanceId || !actionName) {
        return;
    }
    // define messages to be shown in case of success and error
    var successMessage = this.getSuccessMessage(actionName);
    var errorMessage = actionName + " action failed.";
    // define function import parameters
    var functionParameters = {};
    functionParameters.method = "POST";
    functionParameters.urlParameters = { "InstanceId" : processInstanceId };
    functionParameters.success = function() {
        alert(successMessage);
    };
    functionParameters.error = function() {
        alert(errorMessage);
    };
    var processesODataModel = this.getView().getModel();
    // call the required function import
    processesODataModel.callFunction(actionName, functionParameters);
}
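The getSuccessMessage helper referenced in the snippet is not shown here. A minimal sketch of it (our own assumption about its behavior, with illustrative wording) could simply map the action name to a message:

```javascript
// Map a function import name to a human-readable success message.
// The wording below is illustrative; adjust it to your application.
function getSuccessMessage(actionName) {
    var messages = {
        "Suspend": "The process has been suspended.",
        "Resume": "The process has been resumed.",
        "Cancel": "The process has been canceled."
    };
    return messages[actionName] || actionName + " action succeeded.";
}
```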

 

The function takes two parameters: the name of the action to perform and the identifier of the process instance on which the action should be performed. The technical details of calling OData function imports from SAPUI5 are described in the blog post about the implementation of a custom task inbox UI.

Full implementation of the SAPUI5 controller can be found in the attached processAdmin.txt.zip file.

 

Conclusion

This blog post has shown the implementation of an administrative UI for BPM processes using the BPM OData service. The technical details of the BPM Processes OData service have been described, and it was shown how to consume the service from SAPUI5 and how to build a sample UI application based on it.

The attached processAdmin.txt.zip file contains the implementation of the administrative UI described in this blog post. The archive contains the implementation of both, SAPUI5 view and controller for the sample application. The attached implementation can be used to ‘play’ with the UI and is not intended to be used for production purposes.

How to setup Process Analytics? Part 1/3 - How to connect Galaxy to BI?


Many customers have asked about the exact steps to set up process analytics in BPM. To complement the help documentation and to avoid incident creation, I will provide the steps here. We have divided them into two steps:

  1. First step: Create a source system in BI. How to connect Galaxy to BI? (this document)
  2. Second step: Create data sources.

 

After following the documentation above, you will be able to set up process analytics in BPM.

How to connect Galaxy to BI?

Prerequisites

First, the process analytics components have to be installed; they are deployed with the BPEM SCA. Second, you need a BI system.

Create your user account in the BI system

You first need a user account in order to access the BI system. The systems come with some pre-defined users, but the permissions assigned to them are restricted, and those users cannot be used to perform the operations required to set up a connection. The screenshots below describe the steps for quickly creating a user in a BI system:
fig_1.jpg
  1. Login to the BI system with the default user
  2. Run the transaction SU01.
  3. Enter the name of an existing user that will be taken as a reference to create your user account.
  4. Click on the Copy button in the toolbar.

fig_2.jpg
  1. Enter the login ID of your new user in the To field. Keep the selection of the checkboxes as it is.
  2. Click on the Copy button at the bottom of the window to create the user.
     It might happen that the reference user has some profiles or permissions which cannot be copied to your user. You will then see a screen notifying you which elements could not be copied to your new user account. Simply ignore the message and continue.
  3. Fill in the missing user information, especially the credentials in the logon tab, and save your changes. Log off and log in to the BI system with your new user account. You might be asked to change the credentials on first login.

Request rights for BI user

The user you created might not have all the permissions required to set up a connection. The following steps explain how to check the permissions assigned to the user and how to request the missing ones.
Tips for permissions and profiles
To find out exactly which permissions/profiles you need to request, use the transaction SU53. It lists all the profiles your user does not have but which were required for the operations/transactions you tried to run.

fig_3.jpg
  1. Login to the BI system with your user account and run the transaction SU01.
  2. Enter the login of your user in the User field and press the F7 key or the Display icon in the toolbar.
  3. Go to the Profiles tab and check whether the profile S_RS_ALL is in the list. If it is not, open an IT/BC ticket on the component DEV-BICONT-USER and request that the profile S_RS_ALL be added to your user account.
fig_4.jpg

Configure NetWeaver/CE to connect to the BI system

The connection between Galaxy and the BI system is established through the gateway of the BI system. Galaxy first connects to the gateway and registers this connection.
fig_5.jpg
  1. Enter SapJavaResourceAdapter15 in the resource name filter.
  2. Select the resource of type Resource Adapter with the name SapJavaResourceAdapter15.
          Do not select the resource of type JCA Resource with the same name!
   3. Select the Properties tab.
   4. Fill in the missing information and update the other properties as in the screenshot:
     • The ProgramID is used to identify the connection between Galaxy and the gateway of the BI system. Keep this name in mind, because it will be required when you configure the TCP/IP connection in the BI system in the next step.
     • The MaxReaderThreadCount property can be set to a small number, 2-5 (best practice so far). The number can be calculated from the amount of RFC (pull) requests which might arrive from the BW side in the same timeframe: the more InfoProviders scheduled to run in parallel in BW, the more JCo server threads should be configured to accept these requests. Otherwise, the execution within BPM might cause some of the RFC requests to be denied by the JCo server, resulting in errors on the BW side.
     • The SAPClient is the client ID used to log in to the BI system. Usually the client ID is 001, but not always; for example, Q76 is accessed with the client ID 003. Normally the system indicates which client ID should be used on the login page.
     • The UserName should be the login ID of the user you created in the BI system.
     • The ServerName property should be set to the hostname of the BI system gateway, also known as the message server. If you chose a system from the table above, the hostname of the gateway is given in the table. Otherwise, you can determine it with the SAP Logon UI (see point 1 in the screenshot below).
     • The PortNumber property should be set to the system number of the BI system. If you chose a system from the table above, the system number is given in the table. Otherwise, you can determine it with the SAP Logon UI (see point 2 in the screenshot below).

 

    5. Finally, save your changes and restart your server.

    

     fig_6.jpg


Create TCP/IP Connection in the BI system

Now that Galaxy has registered a connection in the gateway, the BI system can connect to it. It actually connects to the gateway and specifies the ID of the connection (i.e. the program ID).

fig_7.jpg
  1. Run the transaction SM59 (transaction for managing the RFC connections to the BI system).
  2. Select the node TCP/IP Connection.
  3. Click on the Create button in the toolbar.

 

 

fig_8.jpg

 

 

  1. First, enter a name for the new RFC connection. You will need this name when you create the data source in the next step.
  2. Select the option Registered Server Program.
  3. Enter the program ID defined in the previous step.
  4. Enter the gateway hostname and the gateway service of the BI system. The gateway service is sapgw followed by the system number of the BI system. For example, the gateway service for Q76 is sapgw29.
  5. Save your changes.
  6. Test your connection. If the connection test fails, first make sure that you restarted your NetWeaver system after configuring the SapJavaResourceAdapter15 resource adapter.
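The gateway service naming rule from step 4 can be expressed as a one-liner; the helper below is purely illustrative, the naming rule itself is what matters:

```javascript
// The gateway service name is "sapgw" followed by the two-digit
// system number of the BI system, e.g. system number 29 -> "sapgw29".
function gatewayServiceName(systemNumber) {
    // Pad to two digits, since SAP system numbers are written as "00".."99".
    var padded = ("0" + systemNumber).slice(-2);
    return "sapgw" + padded;
}

var service = gatewayServiceName(29); // "sapgw29"
```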

Create Source System

So that BI can access the data sources and the data exposed by Galaxy, a source system needs to be created.

fig_9.jpg
  1. Start the transaction RSA1 (Data Warehouse Workbench).
  2. Select Source Systems.
  3. Select the node UD Connect, open the context menu with a right mouse click and click on Create.

 

 

fig_10.jpg

 

  1. Enter the name of the RFC connection you created in the previous step.
  2. Enter a logical name for the source system.
  3. Select SDK-JAVA in the pop-up.
  4. Enter a name for the source system.
  5. Enter BPM720 in the Type and Release field if you connected a Galaxy system of release 7.20 to the BI system.


                The Type and Release field is used by Galaxy's BI connector to determine the structure of the data sources.


fig_11.jpg
  1. Select the source system you created and open the context menu with a right click.
  2. Click on Check. The result will be displayed in the status bar.

 

fig_12.jpg

 

 

 

 

 

 


     
    

How to setup Process Analytics? Part 2/3 - How to install BI initial content for Galaxy?


How to install BI initial content for Galaxy?

 

 

Prerequisites

 

Your Galaxy installation should be connected to a BI system. See How to connect Galaxy to BI document.

 

 

Install Initial Content

Once the source system has been created, data sources and other BI objects can be created. Galaxy offers a content package with pre-configured BI objects so that the process analytics data can quickly be used in BI. To access these objects, the package should first be installed for your source system. The screenshots below describe the steps to install the package:

 

fig_1.jpg

 

  1. Start the transaction RSOR. If you are already in the Data Warehouse Workbench, just click on the BI Content tab in the left panel.
  2. Press the button to select your source system. The following pop-up will be displayed:

fig_2.jpg

     

          1. Press the Deselect All button.
          2. Select your source system in the list.
          3. Press the Continue button.

     

     

       3. Select the Packages item in the BI Content menu.

       4. Select and expand the package with the technical name RS_BCT_GLX (the packages are sorted by technical names).

       5. Select the In Dataflow Before and Afterwards option in the Grouping menu.

         fig_3.jpg

         
        

       6. Select the Collect Automatically option in the Collection Mode menu. 

     

    fig_4.jpg

     

       7. Select the List option in the List menu.

        

          fig_5.jpg
        

     

      8. Select the two info areas and two applications in the RS_BCT_GLX package and drag-and-drop them into the right panel.

      9. Select the Install option in the Install menu.

         fig_6.jpg

         
        

    The installation of the BI content package can take a while. Once the package has been installed, a report is displayed in a new panel at the bottom of the window. You might see some errors about the installation of some data sources; they can be ignored.


    fig_7.jpg

     

    Check Initial Content

     

    Once the installation of the BI content package is done, you might want to check whether the content has been installed correctly.

     

    fig_8.jpg

     

    1. Execute the transaction RSA1 to open the Data Warehouse Workbench or select the Modeling menu item in the left navigation panel.
    2. Click on Source Systems item.
    3. Double-click on the source system for which you installed the BI content.

     

    fig_9.jpg

     

     

    1. Press the button to hide the empty application components so that only the application components you just installed are displayed.
    2. Expand the different nodes to list all the data sources.

    How to setup Process Analytics? Part 3/3 - How to create manually data sources?


    How to create manually data sources?

     

     

    Pre-requisite

     

    You need a configured source system. Read How to connect Galaxy to BI? to learn how to create a source system.

     

    Create an Application Component

     

    First, open the data sources tree for your source system: start the transaction RSA1, select Source Systems in the left menu and double-click on your source system. You can also select DataSources directly in the left menu and then filter the data sources for your source system (use the Choose Source System button in the toolbar of the data source list panel).

     

    Then right-click on the DataSources for ... table header and select Create Application Component.

    fig_1.jpg

    fig_2.jpg

     

    1. Enter the name of the new application component. This name should be unique.
    2. Enter the description. Note that the description is used as the display name.
    3. Press the Continue button.

     

     

    Create a Data Source

     

    First, select the application component you created and open the context menu with a right click. Select the menu item Create DataSource.


    fig_3.jpg


    fig_4.jpg

     

    1. Enter the name of your data source.
    2. Select Transaction Data in the drop-down list.
    3. Press the Continue button.

     

     

    The data source will be created and opened.


    fig_5.jpg

     

    1. Select the Extraction tab.
    2. Click on the selection button. A pop-up will list all the available source objects. Select the desired one.

     

    Now that the source object has been selected, the data source fields have to be defined and mapped to the fields of the source object.

     


    fig_6.jpg

     

    1. Select the Proposal tab.
    2. Press the Default System Proposal button. It will select all the fields listed in the table below.
    3. Make sure that at least one field has been selected. Normally, all of them should be selected. If you do not want some fields, uncheck them.

     

    Finally, press the Activate button in the toolbar. The data source will be finalized and activated (a data source can only be used after being activated).

     

    The message "Field list no longer corresponds to proposal. Copy changes?" might appear when you try to activate the data source. Simply press Yes.

    fig_7.jpg

