Thursday, October 4, 2012

Using Bean Validation in SOA - JEE 6 based sample application

This blog post is just a short note to point to a JEE6 based sample, which was prepared for the JavaOne session "CON5020 - Using JSR 303, Bean Validation, with the Common Data Model in SOA".
JavaOne 2012 Parc 55

The sample, available at github, contains a set of NetBeans (Maven) projects showing the design and usage of Bean Validation constraints.

One interesting aspect might also be the usage of a common data model on the service provider and consumer side: it allows Bean Validation constraints to be reused together with the model on both sides.
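For illustration, a constrained class in such a common model might look like this - a minimal sketch assuming a hypothetical Address class (the JSR 303 annotations travel inside the model jar, so provider and consumer validate against the same rules):

    import javax.validation.constraints.NotNull;
    import javax.validation.constraints.Pattern;
    import javax.validation.constraints.Size;

    // Hypothetical class from a common data model - the constraints are packaged
    // with the model, so service provider and consumer share the same rules.
    public class Address {

        @NotNull
        @Size(min = 1, max = 60)
        private String street;

        @NotNull
        @Pattern(regexp = "\\d{5}") // assumes a 5-digit postal code
        private String zipCode;

        // getters and setters omitted for brevity
    }

Either side can then validate instances with a javax.validation.Validator obtained via Validation.buildDefaultValidatorFactory().getValidator().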

Readers of this blog might already have noticed Oracle ADF Essentials - a JEE end-to-end framework, free to develop and deploy.

The sample project is going to be used in following blog posts to extend the functionality on the service consumer side (view and controller) by leveraging ADF task flows, data controls and ADF Faces - a part of the features provided by ADF Essentials (there is more).

In case you are interested in building ADF applications on top of heterogeneous data sources - stay tuned :)

Friday, August 24, 2012

Designing Modular Applications in Oracle ADF - Follow up

This blog post is a follow-up to the previous one about modularity in Oracle ADF.
Sample applications provided in blog posts are simplified, focusing on one or a few of the aspects described.

Real world ADF applications and task flows can be more sophisticated:


It is a diagram (some-task-flow-definition.xml opened in JDeveloper) of one ADF bounded task flow currently running in a production system - probably the largest ADF bounded task flow ever built :)
 
This example aims to show that it is important to focus on modularity and reusability aspects from the very beginning, because it might be too late once a system has already evolved to a certain size and complexity.

Just for the statistics, the elements and their occurrence counts in this task flow:

adfc-config 1
class 180
control-flow-case 184
control-flow-rule 124
data-control-scope 1
default-activity 1
display-name 30
document 30
exception-handler 1
finalizer 1
fixed-outcome 92
from-activity-id 124
from-outcome 90
id 30
input-parameter 81
input-parameter-definition 4
isolated 1
managed-bean 7
managed-bean-class 7
managed-bean-name 7
managed-bean-scope 7
method 93
method-call 93
name 173
outcome 96
page 2
parameter 172
return-value 81
return-value-definition 4
task-flow-call 29
task-flow-definition 1
task-flow-reference 29
task-flow-return 3
template-reference 1
to-activity-id 184
to-string 1
train 1
train-stop 30
use-page-fragments 1
value 342
view 2
And here is the XSL transformation (described here) used to count the elements in some-task-flow-definition.xml:
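The transformation itself was shown as an image in the original post; a minimal equivalent sketch (XSLT 1.0, using Muenchian grouping to group elements by name) could look like this:

    <xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
      <xsl:output method="text"/>
      <!-- index all elements by their local name -->
      <xsl:key name="elements-by-name" match="*" use="local-name()"/>
      <xsl:template match="/">
        <!-- visit the first element of each name group, print name and count -->
        <xsl:for-each select="//*[generate-id() = generate-id(key('elements-by-name', local-name())[1])]">
          <xsl:sort select="local-name()"/>
          <xsl:value-of select="local-name()"/>
          <xsl:text> </xsl:text>
          <xsl:value-of select="count(key('elements-by-name', local-name()))"/>
          <xsl:text>&#10;</xsl:text>
        </xsl:for-each>
      </xsl:template>
    </xsl:stylesheet>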

Thursday, August 23, 2012

Designing Modular Applications By Using Dynamic Task Flow Call in Oracle ADF

You might already have asked yourself a question like this: how do I design an application in a modular way? The principle of "divide and conquer" helps us get complexity under control by breaking a possibly complex system into smaller, manageable pieces, a.k.a. modules.

This blog post shows how ADF framework features like Dynamic Task Flow Calls and ADF Libraries can be combined with JSF2 bean annotations to achieve modularity on the view-controller side.

The sample application, available for download and runnable in JDeveloper R2, includes two projects: ADFDynamicTaskFlowSample and UtilityTaskFlow.
There is one bounded task flow in each project:
main-task-flow-definition.xml and utility-task-flow-definition.xml

Loose coupling at design time


The project UtilityTaskFlow simulates a module encapsulating some piece of functionality. The module is packaged as a jar and consumed (as a jar on the classpath) in the project ADFDynamicTaskFlowSample like this:

There is no "hard" reference to our module in a consuming task flow main-task-flow-definition.xml at design time:
Notice the message in JDeveloper complaining that a reference utilityFlowReference is unknown. Actually the message states that almost everything regarding a module is "unknown". It should be ok -  we know, that the implementation will be available at runtime - we put it on a classpath :)


The consuming task flow main-task-flow-definition.xml consists of two page fragments (main1 and main2) and one taskFlowCall activity (utilityFlowCall). Our module gets activated on the transition between the two pages.

The sample use case is very simple:

  1. Some value entered on the page fragment main1 gets passed to the utility task flow (module).
  2. The value gets modified in the utility task flow (utilityFlowCall).
  3. The result of the modification is displayed on the main2 page.
Let's start the ADFDynamicTaskFlowSample (run home.jsf) to see it in action.

Loose coupling in action


The first page fragment (main1.jsff) shows up.
We enter some value, like 'ADF' in this picture, and click on Next. The utility task flow gets activated:
We simulate some utility functionality by modifying the text a little bit to fit the scope of this blog post:
We click on Next - the utility flow finishes and returns the result of our efforts to the consuming flow. The result gets displayed on the page fragment main2.jsff:
That should be enough to get the idea. It's time to have a look at the details now.

Defining the interface 


A quote from a definition of loose coupling: "In computing and systems design a loosely coupled system is one in which each of its components has, or makes use of, little or no knowledge of the definitions of other separate components".

What kind of little knowledge is needed in our case?

The Dynamic Task Flow Call activity in ADF requires a Dynamic Task Flow Reference:

Therefore, the expression #{utilityFlowReference.flowId} is the first part of our conceptual interface: we have defined that a consumed module must be a bounded ADF task flow, must identify itself by the name utilityFlowReference, and must provide a reference to the implementing task flow as a TaskFlowId.

The second part of our interface is defined in the Parameters section of the task flow diagram:
Our components should also agree on input and return parameters and values in order to be able to collaborate with each other in a meaningful way.
In the second part of our conceptual interface we define that our utility flow has one input parameter, named inputParameter, and one return value, named returnValue.

The definition of our conceptual interface is now complete.

Note: There is also an implicit agreement in place - about the types used for input and return; in our case it is String. Consider using composition by defining custom types for input and output messages to accumulate the data specific to your application domain.

Implementing the interface


The conceptual interface is implemented in the project UtilityTaskFlow:
The essential parts (actually all parts) are the class UtilityTaskFlowReference.java and the bounded task flow utility-task-flow-definition.xml.

The complete source code of the class shows the implementation details of the first part of our conceptual interface (the name utilityFlowReference and the task flow id):
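The original class is shown as a screenshot; a minimal sketch of such a reference bean could look like this (the task flow document path is an assumption):

    import java.io.Serializable;
    import javax.faces.bean.ApplicationScoped;
    import javax.faces.bean.ManagedBean;
    import oracle.adf.controller.TaskFlowId;

    // Registered under the agreed name, so #{utilityFlowReference.flowId} resolves at runtime.
    @ManagedBean(name = "utilityFlowReference")
    @ApplicationScoped
    public class UtilityTaskFlowReference implements Serializable {

        // document name and local id of the implementing bounded task flow (assumed path)
        private static final String TASK_FLOW_ID =
            "/WEB-INF/utility-task-flow-definition.xml#utility-task-flow-definition";

        public TaskFlowId getFlowId() {
            return TaskFlowId.parse(TASK_FLOW_ID);
        }
    }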
And the Parameters section of the bounded task flow utility-task-flow-definition provides the second part of it (one input parameter, named inputParameter, and one return value, named returnValue):

The utility flow looks like this (one page fragment utility.jsff and two task-flow-return actions):

Finally, a fragment of utility1.jsff for the sake of completeness: the value of the input parameter is bound to an inputText and is copied to the return value on a button click.

Note: please refer to the broad range of available resources and tutorials in case you are just starting with ADF, and particularly with ADF task flows.

Packaging the module


The project UtilityTaskFlow is packaged in JDeveloper as an ADF Library jar like this:
 Create Deployment Profile in JDeveloper:

Deploy to ADF Library Jar file:
We get utilityTaskFlowADFLib.jar as a result (in less than one second :), containing our module:

Our module is finished and ready to be used where needed.

Conclusion


In this blog post we looked at the modularity aspect of ADF application design.

ADF task flows and the Dynamic Task Flow Call, together with the JDeveloper ADF Library packaging facility, facilitate the design and implementation of modular enterprise applications.

The interface used to stick our modules together is a "virtual" one: it consists of a few design decisions and naming conventions. Modularity efforts for the Java platform might result in a standard way of building modular applications one day.

Feel free to take the sources of this sample at github to find out how the concepts and techniques described and used in this blog post can fit your needs now.

Saturday, July 28, 2012

Building a Simple Chat Application with Active Data Service and JSF2 AJAX in Oracle ADF

Data push with Active Data Services (ADS) in Oracle ADF has been available for some years.

AJAX functionality got attention in the JSF2 specification, and JSF2 made its way into ADF with JDeveloper Release 2.

This blog post provides a simple, active ADF chat application, built by using Active Data Service functionality together with the f:ajax tag of JSF2.

Blog readers might find the sample, available for download and runnable in JDeveloper R2, useful for becoming familiar with the basic functionality of the Active Data Service. Various active data configuration options (transport modes, delays) can be evaluated for a specific networking environment by modifying adf-config.xml.

Let's have a look at the building blocks of the sample application in JDeveloper first:


Controller: the unbounded flow adfc-config.xml with two views, chat.jsf and login.jsf, and one bounded task flow, chat-task-flow-definition.xml, embedded into the container page chat.jsf as a region (the chatPageDef.xml page definition contains the region binding as a result).

Model: two Java interfaces, Chat.java and ChatListener.java, describing our chat model.

Implementation: the JSF2 annotated beans ChatBean.java and ChatListenerBean.java.

UI: the page login.jsf and the page fragment chat.jsff.

Web Content and run time behavior of chat sample application


The default unbounded task flow, named adfc-config.xml, contains two view activities: login and chat:
We start the sample by executing adfc-config.xml (pick the login activity in the default run configuration panel) - the page login.jsf gets displayed in a browser:
In my case the default browser is Firefox, so I pick the username Firefox first and press the button Go to Chat - the action chat behind the button gets executed, and the control flow case chat leads us to the page chat.jsf.
Chat with one user might be boring, so I start the Chrome browser, enter the URL, pick the name Chrome and join Firefox in the ADF active chat.

A snapshot of this (simulated) conversation is displayed in the picture:
I leave the browsers waiting for IE to show up and continue with the description of the sample.

Note: this sample application is deployed and (probably) running at this location - give it a try.

Usage of af:activeOutputText and JSF2 f:ajax tag


The sample leverages only one ADF active data element - the af:activeOutputText tag - to initiate a partial refresh of three "conventional" elements on the page: the tables containing chat messages and users, and the text Alive:true.

Note: the ADF UI pattern of using af:activeOutputText in combination with af:clientListener and a JavaScript function (and possibly af:serverListener in case some server-side functionality is involved) is well described in various blog posts and presentations by Lucas Jellema - for example, the recent one provides a sample of how changes in a database can be "pushed" through all the layers up into the UI.


An early description of the Active Data Service related techniques used in this sample was provided in the blog posts Flexible ADS – Combining popups with ActiveDataService and ADF’s Active Data Service and scalar data (like activeOutputText) by Matthias Wessendorf.


This pattern leverages the propertyChange event of af:activeOutputText: propertyChange gets fired when the value of the activeOutputText changes (by a data push from the server), because the background color of the text changes to blue for a moment to indicate the value change to the user.

Let's have a look at the page fragment chat.jsff (essential parts only - layout tags were stripped) to see how the activeOutputText tag and its propertyChange event are used in this sample in conjunction with the f:ajax tag (available since JSF2):

The activeOutputText tag, with its value bound to the "active" property message of the bean chatListenerBean, emits the propertyChange event. The activeOutputText is wrapped by an f:ajax tag, which is "interested" in this event:
event="propertyChange"

According to the description of the f:ajax tag, the render attribute allows you to declaratively specify a list of components (as a space-delimited string of component identifiers) to be rendered on the client: render="t1 t2 ot1".
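Putting both together, the essential part of chat.jsff might look like this (a sketch reconstructed from the description; the ids t1, t2 and ot1 follow the render list above):

    <f:ajax event="propertyChange" render="t1 t2 ot1">
        <af:activeOutputText id="ot1" value="#{chatListenerBean.message}"/>
    </f:ajax>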

The combination of af:activeOutputText and f:ajax allows all non-active components to be refreshed upon a data push from the server in a declarative way. There is no need to write any JavaScript for this particular usage scenario.

Note: well... almost no need for any specific JavaScript. There are two "strange" tags without particular meaning on the page - see the comments like "this tag does nothing" and "anonymous javascript function onevent does nothing".
Actually, they just have to be there - nothing else. I didn't figure out exactly why, but without them ADS didn't work for me.

Let's go to the model part of the sample.

Defining a model for the chat


Two simple interfaces define a model: Chat and ChatListener.

The idea expressed in the Chat interface is to accept a login or logout of a ChatListener, to provide a list of current users (getUsers) and messages (getMessages), and to offer one method, addMessage(String message), to broadcast a new message to the listeners:

The idea of ChatListener is to be able to identify itself by a user name, to receive new messages (as propertyChange events in this case) and to provide a way for the chat to check whether the listener is gone (isAlive). Note: readers might refine the model and extend it for their needs.
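The full sources are part of the sample; reconstructed from the description above, the two interfaces might look roughly like this (the method signatures are assumptions):

    // Chat.java - sketch
    public interface Chat {
        void login(ChatListener listener);
        void logout(ChatListener listener);
        java.util.List<String> getUsers();
        java.util.List<String> getMessages();
        void addMessage(String message);   // broadcast a new message to the listeners
    }

    // ChatListener.java - sketch
    public interface ChatListener {
        String getUser();                  // identify the listener by user name
        void onMessage(String message);    // receive a new message (pushed as propertyChange)
        boolean isAlive();                 // let the chat check whether the listener is gone
    }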

Implementing the model as JSF2 beans


The idea behind the implementation was to leverage JSF managed beans and their scopes. Our chat is intended to be there for everyone, so application scope is a natural fit for it.

A ChatListener can come and go, so view scope was selected for it.

JSF2 annotations were used in the sample to define the managed beans and their scopes and to inject their property values.
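A sketch of how the annotations fit together in such a setup (the bean bodies are elided; the full details are in the sample sources):

    import javax.faces.bean.ApplicationScoped;
    import javax.faces.bean.ManagedBean;
    import javax.faces.bean.ManagedProperty;
    import javax.faces.bean.ViewScoped;

    // one chat for everyone
    @ManagedBean(name = "chatBean")
    @ApplicationScoped
    public class ChatBean implements Chat { /* ... */ }

    // one listener per browser view
    @ManagedBean(name = "chatListenerBean")
    @ViewScoped
    public class ChatListenerBean implements ChatListener {

        // injects the application-scoped chat into every listener instance
        @ManagedProperty(value = "#{chatBean}")
        private Chat chat;

        public void setChat(Chat chat) { this.chat = chat; }
        /* ... */
    }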

There is also a reasonable amount of comments in the source files describing specific details.  

The Developer's Guide for Oracle Application Development Framework - 45 Using the Active Data Service - provides a comprehensive description of the Active Data Service and its configuration options.

To finish this section, here is the complete source:
The source of ChatListenerBean (imports stripped; the Active Data Service part is based on the description in this blog.)
And the source of ChatBean (imports and header stripped):

Conclusion

Server-side data push can be considered a bit challenging due to the nature of the connectionless HTTP protocol.

Oracle ADF provides a built-in implementation of data push that takes care of the challenging aspects of it. By using the simple principles described or referenced in this blog post, "non-active" parts of the UI can be "activated", as in the sample chat application.

Ongoing efforts around HTML5 address this issue by providing the connection-oriented WebSocket. Once the standards behind HTML5 are widely accepted and implemented, the architecture of web applications will probably shift (back) to client-server, bringing new (old) possibilities for developers of active applications.

Que Sera, Sera (Whatever Will Be, Will Be) - but there is no need to wait for the future: the data push functionality provided by Oracle ADF is ready to use now :)

Thursday, July 5, 2012

Using contextual events to refresh Bean Data Control iterator in Oracle ADF

This blog post addresses one particular issue related to the ADF Bean Data Control: how to refresh. Readers who are not yet familiar with the Bean DC might refer, for example, to the tutorial Using Bean Data Control or the video Working with the ADF Java Bean Data Control by Shay Shmeltzer to get an overview of its usage.

Just to recap: Bean Data Controls allow you to create data-bound pages based on data provided by plain Java objects. ADF Data Controls provide access to data through the binding layer in a unified, declarative way, regardless of implementation details.

One of the built-in aspects of the binding layer is caching: you might already have noticed one property of an iterator (visible in the Property panel when the iterator is selected in a page definition): Cache results: <default> true

This is the expected behavior in most cases; it results in performance gains and is provided by the ADF framework by default, without additional development effort.

In case the data behind a Bean Data Control is modified directly in Java, an additional development step is needed to refresh the appropriate iterator and get the result of the modifications visualized in the UI component based on that iterator.

This blog post provides an example of this issue and shows one simple technique for "how to refresh without side effects" - without losing other functionality of the framework, like the current row - for a Bean Data Control.

The sample application, available for download and runnable in JDeveloper R2, is a simple in-memory implementation of a CRUD+ use case for an ADF table, based on a Bean Data Control and a Java collection of domain objects (in our case, a collection of Persons).
The symbol + stands for extending CRUD with "copy Person" functionality in order a) to create a new Person based on the one selected in the table and b) to show how to refresh an iterator (personListIterator) after a new object (Person) is added to the collection in Java.
Let's take a look at the sample application first, in order to better understand the refresh issue.


Short description of the CRUD+ sample - using the standard functionality of the ADF Bean Data Control

There is one bounded task flow, named persons-task-flow-definition, containing one page fragment with one ADF table in it:
The table is based on the iterator personList from flowController and CRUDpersonsTaskFlowDataControl (see the previous blog post with a description of the Flow Scoped Data Control for more details regarding this technique):
The buttons Create and Delete are based on standard ADF operations (created by drag & drop) - nothing exciting, it just works. The button Copy selected person was created by drag & drop of the copyPerson method from the data control onto the page, providing currentRow.dataProvider of the personList iterator as the parameter.


The Java class behind the data control, named CRUDPersonsFlowController, contains the collection personList and the implementation of the method copyPerson:
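The class itself is shown as a screenshot; a sketch of the relevant parts could look like this (the full source is in the sample):

    import java.util.ArrayList;
    import java.util.List;

    public class CRUDPersonsFlowController {

        private List<Person> personList = new ArrayList<Person>();

        public List<Person> getPersonList() {
            return personList;
        }

        // Clones the selected person via the copy constructor and adds the clone
        // to the collection. The modification happens in plain Java, behind the
        // back of the binding layer - hence the refresh issue described below.
        public void copyPerson(Person person) {
            personList.add(new Person(person));
        }
    }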


To complete the brief description of the sample, a snippet of the domain object used, Person.java, looks like this:
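A sketch of the domain object with its copy constructor (the attribute set is an assumption):

    public class Person {

        private String firstName;
        private String lastName;

        public Person() {
        }

        // copy constructor used by copyPerson
        public Person(Person other) {
            this.firstName = other.firstName;
            this.lastName = other.lastName;
        }

        // getters and setters omitted for brevity
    }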

And the page definition at this stage contains the bindings created implicitly as a result of our drag & drop declarative efforts:

Note: a recent blog post by Depac C S shows an alternative, ADF Java API based way to create table rows based on existing row content.


The issue with "Refresh" related to personListIterator


Our sample behaves almost as expected at this point: one click on the Create button and a new empty row is created; a click on Delete removes the selected person from the list.

The refresh issue shows up when using the copyPerson functionality: we click on the button Copy Selected Person, the method copyPerson in the flow controller Java class gets executed, we get the selected person as input in a Java method, create a new instance of the domain object Person using the copy constructor and add the created instance to personList.

Expected result: the table contains one more person - a clone of the selected person.

Actual result: nothing. Everything looks the same as before issuing the copyPerson action.


The table on the page is not refreshed automatically after the data behind it is modified in Java. It shows the old (cached) data of personList.

We need to take care of the "refresh" by issuing an Execute on the personList iterator to align the contents of the collection in Java with the cached result of the previous personListIterator execution.

Using contextual events to refresh personList iterator

In order to fix the issue, we go to the page definition of editPersonList.jsff and first create a new Execute action binding on personListIterator: open editPersonList.jsff, click on the + in the Bindings panel, select the item to be created (action), click OK and set the properties Select an iterator and Operation like this:
We get an Execute action binding as a result of this effort:
Now we only have to issue this Execute action after the execution of copyPerson in order to "renew" the content of personListIterator.

We can achieve that by using the built-in contextual events of ADF in two simple steps:

  1. Issuing a contextual event copyPersonEvent upon execution of the copyPerson action and
  2. Subscribing the Execute action of personListIterator as a listener to it.

Creating copyPersonEvent contextual event on copyPerson action binding


Select copyPerson in the Structure panel of editPersonListPageDef.xml, right-click - Insert Inside copyPerson -> events:
An element Events is created. Select it and create a new event by issuing a right-click -> Insert inside events - event:

Provide a name for the event in the panel:
The Events section in the Structure panel then looks like this:

The event copyPersonEvent gets fired once the copyPerson action is executed. There are no subscribers for the event at this stage.
Let's subscribe the Execute action of personListIterator to listen to the event in order to renew the iterator content.

Subscribing action binding Execute of personListIterator to contextual event


We go back to the Page Data Binding Definition, select the tab Contextual Events, then select copyPersonEvent (the event created in the previous step), and select the Subscribers tab on the right side:
Click on the + symbol to create a new subscription. In the panel Subscribe to Contextual Event, pick or enter the properties Event, Publisher and Handler like this:
A new element, Event Map, gets created as a result, containing the list of subscribers to contextual events.
We can see the Event Map in the Structure panel now:
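In the page definition source, the declarative steps above result in XML roughly like this (a sketch; attribute details may differ in your version):

    <methodAction id="copyPerson" ...>
      <events xmlns="http://xmlns.oracle.com/adfm/contextualEvent">
        <event name="copyPersonEvent"/>
      </events>
    </methodAction>

    <eventMap xmlns="http://xmlns.oracle.com/adfm/contextualEvent">
      <event name="copyPersonEvent">
        <producer region="*">
          <consumer handler="Execute"/>
        </producer>
      </event>
    </eventMap>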

Bonus - using copyPersonEvent  to select a new row containing the person created by copyPerson action

As a bonus for this blog post, an additional action binding for the built-in operation Last of personListIterator was created and subscribed to the same event.
It shows that more than one subscriber to an event can be defined, and provides one use case for this feature: positioning the current row on the person copied in the table on the page. The steps already described in the previous chapters were used.

Behavior of our sample application

We start the sample application and first create a new row by a click on the Create button. Let's name the person created Mister Smith:
Click on Submit. A new instance of the Person object gets added to the collection of persons.

Now we clone Mr. Smith by a click on Copy selected person
(remember The Matrix? There were a lot of them at the end :)

Our Mister Smith is cloned and displayed at the end of the table (the execution result of copyPerson, which fires copyPersonEvent, which triggers Execute on personListIterator as a subscriber to copyPersonEvent).

And the bonus: the cloned Mister Smith is selected in the table as the current row (the execution result of the Last action on personListIterator as a subscriber to copyPersonEvent):
Conclusion
  • The sample application shows a simple in-memory implementation of CRUD+ functionality by using the built-in standard functionality of the ADF Bean Data Control in a declarative way
  • The issue with "refresh" was compensated for by using the built-in ADF framework functionality of contextual events in a declarative way
  • The plain Java code of the flow controller is meant to facilitate easy reading, testing out of context (no dependencies on additional APIs) and understanding - an important aspect of every enterprise system, given the (usually) long lifecycle of such systems.
The sources of the sample application are provided at github.

Also check the ADF EMG Samples page for additional sample applications covering a broad range of ADF.

Sunday, June 17, 2012

Using JDeveloper Maven integration to run Oracle ADF sample in Java cloud

Readers of this blog might have already noticed one link on the right side, pointing to an experimental page with a live ADF sample application embedded on it. It looks like I'm not alone in having the idea of live ADF samples in my head. Zeeshan Baig, featured in a recent Oracle JDeveloper OTN Harvest, shares the same wish:
 "Free cloud instance for ADF like apex.oracle.com which is available for APEX".
 And fellow Oracle colleague Steven Develaar seems to get no sleep dreaming about live samples:
As a good night's sleep is one of the important aspects of life in order to be able to deal with the challenges of the day, this blog post aims to make a small step towards ADF live samples and to increase the average level of happiness in the community around the framework.

The sample application of this blog post is a kind of "Hello World" for ADF and bounded task flows, so the main focus is on using the JDeveloper Maven integration (available since R2) to create, package and run it.

The sample leverages a subset of the framework: ADF Faces components (view) and task flows (controller). The cloud platform in this case is actually based on Tomcat 6, so the steps used to get the sample running on the CloudBees Java platform could be useful in case someone wants to run it on Tomcat locally. Important aspect: the platform is not certified or supported to run ADF, so please don't expect too much - especially not a complete ADF stack up and running on it without effort.

Oracle Public Cloud was announced (again) last week - if you are in the position of having an account, there shouldn't be a challenge to deploy and run the full ADF application stack on it; as stated here, it is ready to run the ADF stack.

Not being in that position yet, I proceed with the description of my sample.
The application, available for download and runnable in JDeveloper R2, should look like this upon start:

Actually, if you see something like "ADF Task flow running..." in the area above this text - that is the sample application running in the Java cloud.

UPDATE: A recent power failure and the associated infrastructure failures in Amazon US-EAST-1 impacted several high-profile services, and also this ADF sample application - the application has been restarted again.

You can click, for example, on the button next to become familiar with its functionality. It is embedded in this blog post:
There is one page, home.jsf, with one bounded task flow named sample-task-flow embedded as a region on it. The application-scoped managed bean CounterBean.java is used to provide the "rich" functionality of the sample: to display the start date and to count the page views of the two page fragments in the bounded task flow.
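A sketch of such a counter bean (the member names are assumptions; the full source ships with the sample):

    import java.util.Date;
    import java.util.concurrent.atomic.AtomicInteger;
    import javax.faces.bean.ApplicationScoped;
    import javax.faces.bean.ManagedBean;

    @ManagedBean
    @ApplicationScoped
    public class CounterBean {

        // captured once, when the application starts
        private final Date startDate = new Date();
        private final AtomicInteger view1Count = new AtomicInteger();
        private final AtomicInteger view2Count = new AtomicInteger();

        public Date getStartDate() {
            return startDate;
        }

        // referenced from the first page fragment - counts its views
        public int getView1Count() {
            return view1Count.incrementAndGet();
        }

        // referenced from the second page fragment
        public int getView2Count() {
            return view2Count.incrementAndGet();
        }
    }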

Let's go through the steps used to create, package and deploy it.

Using JDeveloper R2 + Maven integration to create the sample ADF app for java cloud (in 5 minutes :)

We start in JDeveloper  with New -> Application -> Maven Application:
Step 1 of the wizard looks like this:
 Provide a name for the project on Step 2:
 Step 3: change packaging to war, other settings are default:
Finish. JDeveloper creates the Maven project structure for us, together with pom.xml.

Note: Once you create a Maven application and project, JDeveloper also takes care of keeping Libraries and Classpath in the Project Properties in sync with the dependencies in pom.xml - very nice and helpful.

Our project is still quite bare at this stage - there are no dependencies at the beginning.

We just tailor one setting for now: go to Project Properties -> ADFm Sources and change the ADFm Source Directory to point to ...src\main\resources like this:
This saves some manual fixing later, because once we start to use specific ADF features, their metadata, like page definitions, is going to be generated here. Otherwise we would get ADF metadata created outside the default resource path.

The other JDeveloper project properties were already set by the wizard:
  • Project Source Path points to ...src\main\java
  • Output directory points to ..\target\classes 
  • Project-> Resources look like this:

The steps used to create web content (fast forward)

Create an ADF bounded task flow in JDeveloper, following the default settings all the way: New -> JSF/Facelets: ADF Task Flow: OK. A task flow named task-flow-definition.xml is created.
Open it, drag & drop two view components, and provide two control flow cases and names for them:
The JSF page home.jsf was created after that, and the task flow was dropped as a region onto it.
The Java class CounterBean.java was created; the JSF2 annotations @ManagedBean and @ApplicationScoped were used to put it into the scope of this sample. ADF panelLabelAndMessage and outputText components were used on its UI pages. The picture of our sample project in JDeveloper provides an overview of the sample.

Maven pom and JDeveloper project libraries - what happens in the background

As already mentioned, JDeveloper takes care of the project libraries and Maven pom dependencies for us. Once an application component gets created, the required libraries are automatically added to the project settings and Maven dependencies:

JDeveloper project libraries at this stage:
And Maven dependencies in pom.xml:
The same libraries and dependencies are in the list.

A few tweaks to get it up and running in the Java cloud

You might have already guessed it - there has to be something more. Of course. The Tomcat-based Java cloud platform is not certified, so some manual "tweaks" are necessary to get the sample up and running on it.
  • Add 2 JRF libraries as Maven dependencies to the pom in the JDeveloper Maven settings (Tomcat doesn't provide the JRF runtime libs - we need to ship them together with our application):
  • Add one context param to web.xml - to switch pretty URLs off:

  • The sample project contains one CloudBees platform specific deployment descriptor: WEB-INF\cloudbees-web.xml. The descriptor provides the name (or application id) of the bees application created for this sample - in this case <appid>mavenproj</appid>.
  • Only for embedding the ADF application as an iframe pointing to a different domain: the frame-busting context param was set to never in web.xml (see the sketch after this list) to prevent the application from "popping out" of this blog:

  • The Trinidad jars trinidad-api.jar and trinidad-impl.jar were copied manually from the JDeveloper library locations into the application directory WEB-INF\lib to overcome a lib compatibility issue.
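For illustration, the frame-busting parameter mentioned above is a standard Trinidad context parameter and looks like this in web.xml (a sketch; the pretty-URL parameter is omitted here):

    <context-param>
        <param-name>org.apache.myfaces.trinidad.security.FRAME_BUSTING</param-name>
        <param-value>never</param-value>
    </context-param>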

We can use the Maven package goal to package the application as a war, for example directly in JDeveloper like this:

And the log output in case of normal execution shows a successful build and one webapp packaged as JavaCloudMavenSample-1.0-SNAPSHOT.war:

That's all - the section on how to create an ADF sample app for the Java cloud in 5 minutes is finished.

Our application is ready - it's time to test and deploy it.

Deployment to the CloudBees Java platform
GettingStarted describes a few ways to deploy a Java application to the platform (called RUN@Cloud). One of them is to download the SDK and use the CLI (command line interface) tool.

Note: the CLI is normally cloud service vendor specific - for this platform the command line utility is called bees.

Once the SDK is downloaded and configured, it can be used to create and deploy applications on the platform.

You also have to sign up to the platform and subscribe to the run service - as always before using some service on the Internet. There is also a free subscription to start with, limited in resources, but enough to run the sample of this blog:

Using the bees console (CLI), test the application, packaged as a war, locally first:

>bees app:run target/JavaCloudMavenSample-1.0-SNAPSHOT.war

Note: one error message (among some other messages) is displayed: oracle.jrf.UnknownPlatformException: JRF is unable to determine the current application server platform.
It is ok - the platform is not certified, Oracle JRF doesn't know it, and the exception states exactly that.

Once the sample is running and responding locally (http://localhost:8080/faces/home.jsf), it can be deployed to the Java cloud platform by issuing the command:


>bees app:deploy target/JavaCloudMavenSample-1.0-SNAPSHOT.war






Conclusion

Following the description in this blog post, we created a simple ADF application leveraging the view (ADF Faces) and controller (task flows) parts of the stack. We used the JDeveloper Maven integration to set up the Maven project structure and to package the application.

And finally, we followed the Java principle of "write once, run anywhere" and got our ADF application (ADF is a Java standards based framework facilitating easy, declarative usage) running "everywhere" - locally and in a public Java cloud.

The sources of this sample are available at github.