How To Set Up An IDM Project



For the technical setup of an Identity Manager project, many requirements have to be considered. Some of them stem from the fact that the Identity Manager is a web-based application and therefore requires a Java Enterprise architecture. The installation of such an application is quite different from the installation a mere mortal is accustomed to.

Other requirements are bound to the development and test cycles targeted at different environments. In this context an environment is an isolated setup of all of the necessary pieces, including the Identity Manager itself and all the systems, databases and directories that have to be integrated. A huge challenge here is the availability of all these targets. Very often a particular system (e.g. Active Directory) is only available once and can't be cloned easily.
Very often more than one developer is involved in a project. They must work in a coordinated, collaborative manner. Changes and enhancements must be available to all of them, regardless of where they are located or where they work from.
The following article describes what should be taken into account while setting up a development environment for an Identity Manager project.
It begins with a short wrap-up of how the Sun Identity Manager is architected. Everybody who is already accustomed to this can simply skip this paragraph and jump directly to the description of the different artifacts or to the sketch of the development cycle. The documented approach has been used in several customer projects and is considered to be a blueprint. At the end of the document you find a chapter that gives a detailed checklist on what to do for setting up a project. There are different chapters for doing that on OpenSolaris and Red Hat Enterprise Linux utilizing Subversion and the Sun Application Server. Everything is installed into a VMware image that can be downloaded and used as a starting point. The selection of components is simply bound to a specific project. The processes described are general and can be implemented using any product.

Architecture of the IDM

The Identity Manager is an integrated solution for user provisioning and synchronization in heterogeneous environments. Sun Identity Manager covers the whole life cycle of a user account, from creation to termination. The architecture is very open and extensible. For the integration of the different resources it utilizes the public APIs of the products. Usually there is no need to install any additional component on the systems to be connected.
Identity Manager consists of a kernel written in Java and a set of servlets and JSPs that form the user interface. The main component of the kernel is an XML interpreter, as everything is described in the form of XML objects. Beyond that, the kernel is basically a workflow and rendering engine.
As already pointed out, the whole system is based on XML objects of different types. These objects are stored in a database, the so-called repository. When working with the system these XML objects are read, interpreted and used according to their type. If the type indicates a form, an HTML page is rendered by the rendering engine. If a workflow is indicated, the object is handed over to the workflow engine.
The servlets mainly form the user interface. Furthermore they can be seen as the glue code for interaction with the underlying kernel.


A system for user provisioning only makes sense if the managed data can be read from and written to whatever system is involved. In the context of the Identity Manager these systems are called resources. Resources come in different flavors: they range from simple text files through directory servers up to full-fledged enterprise resource planning systems. The requirements of every system are of course different, but inside Identity Manager they are hidden beneath a uniform abstraction layer. The most prominent resources are Active Directory, SAP HR, SAP R3 and LDAP. A list of supported resources can be found in the IDM Resource Reference.

For connecting these resources to the system, so-called drivers or adapters are needed. These are software packages of Java classes that implement everything needed to talk to the resource. Usually they are available as so-called JAR files (Java ARchive). A JAR is a simple archive containing all the necessary files. In most cases they are developed by the vendor of the resource and are available for download on the vendor's website. The MySQL driver, for example, is available here.

Web Application

As a web application the Identity Manager needs a runtime container. This container implements several well-defined public APIs. Utilizing these APIs, applications have access to services like database or directory access, authentication, authorization and session services. Java-based application servers are available in two different flavors: full-fledged J2EE application servers and lightweight servlet containers. A short introduction to the J2EE architecture can be found here.

A servlet container basically implements the Java Servlet and JavaServer Pages APIs. A J2EE server in contrast must implement every aspect of the J2EE API. Therefore it incorporates a servlet container, but supplies additional services to the applications. Tomcat from the Apache Software Foundation is the reference implementation of a servlet container. GlassFish, developed by Sun Microsystems, is the reference implementation of a J2EE application server.

The Identity Manager relies on a servlet container and does not need any service of a full application server. Therefore a servlet container like Tomcat is sufficient. Nevertheless, the usage of a full-blown application server like GlassFish comes with advantages in handling and monitoring.
The supported container can be looked up in the document IDM_Installation and the Release Notes.

File Structure

A web application consists of HTML pages, images, JSP files, servlets and Java code. The Java code is very often organized into JAR files (Java ARchive), the Java version of a library. The files are stored in a hierarchical directory structure and simply archived in a single file. All files together form the web application. To ease the handling, everything is bundled into a single WAR file (Web Application ARchive). Again, this is a simple archive file forming the application. Servlet containers and application servers know how to deal with such files.
The directory structure of the Identity Manager web application is laid out as follows. Note that it is slightly shortened.

  jsp directories
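The layout below is a sketch assembled from the directories mentioned throughout this article; your version may differ in detail:

```
<IDM-Root>/
    config/                  base configuration of the system
    samples/                 XML objects for seeding the repository
    styles/                  CSS files of the user interfaces
    user/                    JSP files of the enduser interface
    WEB-INF/
        classes/             unpackaged custom Java classes
        lib/                 core, third-party and driver JAR files
        ServerRepository.xml repository connection information
```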

The file located in the <IDM-Root>/config directory contains the base configuration of the system. Historically this file was read only during startup. Modifications to this file therefore required a restart of the application server or at least of the servlet container. Current versions are capable of re-reading this information while running.

Java Code

The system kernel is stored as several JAR files in the directory <IDM-Root>/WEB-INF/lib. Besides the archives forming the core IDM application there are others that are needed by the core but are developed by third parties. Additional drivers for accessing the different types of repositories also have to be stored in this directory. One example is the file mysql-connector-java-5.0.5-bin.jar, the JDBC driver for accessing the MySQL database.

Depending on the application server of choice this is also the directory to supply the activation and mail frameworks. Simply store the files activation.jar and mail.jar in this directory. See IDM_Installation for further information. If you ever want to develop your own Java code, it can be stored in the same directory bundled as a JAR file. Alternatively your classes can be placed in the directory <IDM-Root>/WEB-INF/classes without being organized into a JAR file. This is an option if you just have a few classes and the overhead of organizing them into a JAR isn't worth the effort. If the IDM IDE is being used, the compiled Java classes are placed into this directory by default. A Javadoc-based documentation of the Identity Manager Java classes can be found in the REF Kit of the distribution package. Once unzipped, the documentation is located in the directory javadoc. Note that the REF Kit is not installed by default.
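As a sketch, a self-developed helper class might look like the following; the class and method names are made up for illustration and are not part of the product:

```java
// Hypothetical project utility class -- compile it into
// <IDM-Root>/WEB-INF/classes or bundle it into a JAR under WEB-INF/lib.
public class NameUtil {

    // Concatenates given name and surname with a single blank in between,
    // tolerating missing parts.
    public static String fullName(String given, String sur) {
        if (given == null || given.isEmpty()) {
            return sur == null ? "" : sur;
        }
        if (sur == null || sur.isEmpty()) {
            return given;
        }
        return given + " " + sur;
    }
}
```

Such a static method could then be called from XPRESS via the `<invoke>` expression.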


The IDM root and several subdirectories hold the IDM JSP files (JavaServer Pages). The files forming the enduser interface, for example, are located in <IDM-Root>/user.
Customizations of the enduser interface are usually realized by modifying the CSS files in <IDM-Root>/styles. Most of the time the JSP files don't need to be touched.
Still, if the flexibility of the CSS files is not sufficient, the JSP files have to be changed. Unfortunately, simply renaming the modified files is not possible: in various files the names of the JSP files to be called are hardcoded. If you start modifying file names, all referencing files have to be touched as well.

XML Files

As already explained, the Identity Manager is basically a set of Java code forming a runtime system, consisting of different engines and a large number of helper classes. The whole configuration and every other aspect of the Identity Manager is defined using XML objects. There are, for example, User objects or Resource objects. These objects are interpreted by the engines of the Identity Manager, remodeled into Java objects and used to build the system.
For every resource that should be connected, Identity Manager needs code to talk to the system: the Java-based adapter classes, mostly in the form of JAR files. Besides the implementation, a system to be connected has to be configured. This is done by defining an XML object of type ResourceType. The attributes of this resource type describe further aspects of the system; examples are the name or the proxy user credentials for accessing the system. The main information is basically the class name of the Java classes that build the driver. Given this information the Identity Manager knows which code to call if a component wants to access the system. This behavior is called late binding and bridges the gap between the XML objects and the Java code.

Besides resource definitions, the XML objects define the user forms (HTML pages), workflows, even users and much more. These XML objects are stored in the IDM repository. Modifications and enhancements are done by altering these objects in the repository. The initial state of every object in the repository can be rebuilt by cleaning the repository and reimporting every single XML file from <IDM-Root>/samples. Besides the initial seeding of the repository, this directory is a valuable source of examples of how to work with the different aspects of Identity Manager. Simply copy and modify the files to your needs. After reimporting a file into the repository your modifications are in place.
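A heavily shortened sketch of such a resource definition is shown below; the adapter class and attribute values are illustrative only, not a complete working definition:

```
<Resource name='Corporate LDAP'
          class='com.waveset.adapter.LDAPResourceAdapter'>
  <ResourceAttributes>
    <ResourceAttribute name='Host' value='ldap.example.com'/>
    <ResourceAttribute name='Port' value='389'/>
  </ResourceAttributes>
</Resource>
```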

More information and a short introduction can be found in IDM_Administration.


The Identity Manager repository is the single location where all the XML artifacts are stored. Additionally, all log information is stored in it as well. If a workflow has to be suspended because an approval is pending, it is persisted in the repository. The repository is the single most important part of the Identity Manager; if it is not available, the Identity Manager can't start up. To access the repository a JDBC driver is needed, the same technology used for accessing every other resource database. The supported repository databases are listed in IDM_Installation and the Release Notes. Note that not all databases that can be used as a resource are also supported as a repository.
The Identity Manager must know which repository to access while starting. This information is read from the configuration file <IdmRoot>/WEB-INF/ServerRepository.xml. This file is created using the lh setRepo command. See IDM_Administration for more details.
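A typical invocation for a MySQL repository might look like this; the flag syntax is quoted from memory and varies between versions, so verify it against IDM_Administration before use:

```
cd <IDM-Root>/bin
./lh setRepo -tMysql -ujdbc:mysql://localhost:3306/waveset -Uwaveset -Ppassword
```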
The size of the XML objects needed to form the Identity Manager is not that big, just a few tens of megabytes. On the other hand, if your Identity Manager is expected to be highly available, the database must be highly available as well. Simply put, without the repository there is no Identity Manager. Therefore most installations use cluster technology like Sun Cluster to make the database platform as available as possible.
Concerning the Identity Manager itself, it is questionable whether it must be highly available at all. All applications continue to work even if the Identity Manager is not available. If, on the other hand, the Identity Manager forms your user self service, it is a good idea to improve the end user experience and keep this page as available as possible.
The Identity Manager functions like any other web application; therefore a load balancer is used to establish high availability. The load balancer supplies a virtual endpoint for the service and balances user requests across as many Identity Manager instances as needed. But keep in mind that all instances of the Identity Manager have to rely on the same repository database.


I won't discuss the security aspects in detail here, as there is a very good chapter in the IDM Administration Manual already.
But there are some aspects directly related to IDM projects that need to be discussed briefly.

Identity Manager uses so-called Server Encryption Keys to securely store sensitive data in its repository. To date, this includes:

  • user passwords
  • user previous passwords (history)
  • user answers
  • resource passwords

The Server Encryption Keys are symmetric 168-bit Triple DES keys. There are two types of keys supported by the server. The first is the default key, which is compiled into the server code. The second is a randomly generated key that is created while populating the repository.

Keep in mind that every time you delete and re-initialize the repository, a new and different key is generated.

The Server Encryption Keys are objects maintained in the repository. The number of encryption keys in a given repository is not limited.
Identity Manager uses the current default Server Encryption Key to encrypt each piece of sensitive data stored in the repository. The encrypted data is prefixed with the ID of the encryption key that was used to encrypt it. When an object containing encrypted data is read into memory, the encryption key associated with the ID prefix is used to decrypt it.
It is therefore very important that any Server Encryption Key referenced by some object's encrypted data is available in the repository; otherwise, the server will not be able to decrypt the data.
If an object containing encrypted data is imported from another repository or the IDE, then the associated Server Encryption Key must be imported first to ensure the object can be successfully imported.
That said, after the deployment of IDM one of the first actions should be to export the server keys and check them into the version control system. If there are multiple developers that might create encrypted data (see above for the types), all of them should export the Server Encryption Keys of their environments and check them into the version control system.
To avoid these multi-key issues, as well as to maintain a higher level of data integrity, use the Manage Server Encryption Task to re-encrypt all existing encrypted data with the "current" server encryption key.
This task allows an authorized security administrator to perform several key management tasks, including generating a new "current" server key and/or re-encrypting existing objects, by type, with the "current" server key. See the Identity Manager Administration Guide for more information on how to use this task.
After re-encrypting everything with a newly created key, feel free to remove all keys but the newly generated one.

To generate XML objects that include encrypted data, like resource adapter definitions or administrative accounts, I usually use the Identity Manager wizards to create the objects and export them afterwards. This way the data is correctly encrypted, but tied to the current default Server Encryption Key of the environment the objects were created in.
If I don't want to have this dependency, I can replace the password attribute with the asciipassword attribute and supply the password in cleartext, which is obviously not a good idea in a production environment.

For resource definitions it is acceptable to simply remove the content of the password attribute and to have the administrator log in and set the password for each resource.

Identity Manager projects

Identity Manager comes with a lot of Java classes for dealing with internal structures and objects. On top of this there are a lot of helper or utility classes that have proven useful. An enhancement with self-developed Java classes is often not necessary, but possible of course. In fact it is a good idea to check whether a particular problem isn't already solved by one of the existing classes or methods.
Identity Manager comes with its own built-in programming language called XPRESS. It's meant to be used as a sort of glue code and is targeted at implementing smaller requirements. XPRESS is an XML-based language, optimized for strings and lists but not limited to them alone. If preferred, even larger problems can be addressed with XPRESS. The main usage is for implementing transformation methods and workflows. See the documentation for further information.
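A small XPRESS sketch of such glue code, concatenating two attributes into a full name (the attribute paths are illustrative):

```
<concat>
  <ref>user.global.firstname</ref>
  <s> </s>
  <ref>user.global.lastname</ref>
</concat>
```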

The ingredients of an Identity Manager project are:

  • Resource definitions,
  • Configuration objects,
  • Form definitions,
  • JSPs,
  • Images,
  • Workflow definitions,
  • Functions and/or function libraries,
  • Java classes (if necessary).

Everything apart from the Java classes is an XML object and is stored in the repository. The Java classes are stored in the application filesystem and packaged into the WAR file. Therefore every Identity Manager project consists of two components: the WAR file and an XML import file for the repository.

Creation of deployment packages

The final application consists of the initial product as delivered to the customer and the project-specific definitions, modifications and enhancements. The final package is built by merging these pieces together. First, the delivery package has to be extracted into a so-called staging directory.
Afterwards all project-specific components (e.g. HTML pages, JSP files and Java class files) are copied to the appropriate locations in the staging directory and its subhierarchy. Then the staging directory is repackaged into a WAR file (Web Application ARchive).
This file is now the program to be installed on any application server or servlet container.
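With the standard jar tool the merge can be sketched as follows; all paths are examples:

```
mkdir staging && cd staging
jar -xf /path/to/delivery/idm.war          # extract the delivery package
cp -r /path/to/project/web/* .             # overlay JSP files, images, ...
cp /path/to/project/lib/*.jar WEB-INF/lib/ # add custom and driver JARs
jar -cf ../idm-project.war .               # repackage the staging directory
```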


The term deployment describes the installation of the WAR file into the runtime container. Very often the file is installed using the tools of the servlet container of your choice. Other containers just need the file extracted into a special directory monitored by the container; this feature is usually called auto deployment.
The reason for using a special tool is usually that modifications or additions to one or more configuration files are necessary. The tools just copy the WAR file to the correct location and extract the content if needed.
Depending on the servlet container, a restart of the server or the container can be necessary. Now the application can be accessed using the deployment URL. This URL is constructed from the context root supplied at deployment time and the container's base URL. Assuming that the context root was chosen to be /idm, the deployment URL would be http://<Servername>:<ServerPort>/idm.
As already described, there is a second part necessary for a complete installation of Identity Manager: the repository must be populated with the XML objects. This is a two-phase job. First the default XML objects have to be imported, then the update file as built by the IDM IDE has to be incorporated. This file doesn't hold the XML objects itself, it is just a wrapper including the different XML files. See IDM_Installation for further guidance on how to generate the initial product WAR file. For populating the repository simply import the <IdmRoot>/sample/init.xml file. Keep in mind that the project-specific XML objects overwrite existing objects of the same name or ID.
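From the command line the two import phases might be run like this; as above, verify the exact lh syntax for your release against the product documentation:

```
cd <IDM-Root>/bin
./lh import -v ../sample/init.xml             # default XML objects
./lh import -v /path/to/ProjectUpdate.xml     # project-specific update file
```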

It's best practice not to alter the original XML objects. Instead, copy the file containing the XML object you need to a project-specific file, then change the name and ID to project-specific values.

Another thing to remember is that XML objects can be referenced by other XML objects using their names or IDs. Therefore, in some cases a simple rename might not be sufficient: if references exist, they have to be changed accordingly.
Apart from full deployments there are incremental deployments. If only modifications of XML objects were made, the only thing to do is to import the XML update file as built by the IDM IDE. If Java classes or JSP files have been modified, the WAR file must be built and deployed again.
A full deployment runs like this:

  • Creation of the deployment archive, built from the product delivery bits, the necessary drivers, any additional Java classes and modified JSP files (if any)
  • Creation of the repository database tables with the creation scripts from the samples directory
  • Import of the product delivery XML objects
  • Adaption of the project specific XML objects to the target environment (change host name, login credentials, etc. according to your targeted environment)
  • Import of the project specific XML objects
  • Deployment of the created archive.

Development process

Given the process described above, a manual installation of Identity Manager is complex and error-prone. Historically the Business Process Editor (BPE) has been used to modify XML objects. The BPE acted directly on the objects stored in the repository. The objects were read, interactively edited and written back. As the modifications weren't tracked or stored somewhere in the filesystem, it was very hard to track modifications in a consistent manner.
The only way to generate consistent, reproducible project packages was to store local copies of new or modified XML objects and use the Custom Build Environment (CBE), a tool that was built in the field for use in projects. As the file export wasn't enforced, it was usually forgotten and the content of the repository and the files on disk diverged.
The introduction of the IDM IDE with Identity Manager 6.0 addressed these aspects of the development process. It is developed as a plugin for the NetBeans development tool. As the IDE already addresses the default requirements of a developer, including support for version control systems, the plugin can concentrate on aspects that are Identity Manager specific. It still relies on the CBE, but completely covers the process the developer has to deal with. You can still access the repository directly with the IDM IDE and work on XML objects, but usually this should not be done.

Version control

With the introduction of the IDM IDE the development process of an Identity Manager project is nearly identical to that of any other distributed software development project. The next issue to address is versioning and controlled access to the source files. This issue is addressed by the use of a version control system. These systems track every modification with a time stamp and the authorship. Ideally every modification is documented using the check-in message, but this is often forgotten.
Even when developing in larger teams, it is now possible to ensure that every developer is working with the most current versions of all necessary files. If errors are detected, it is possible to roll back to older versions of the same file. If a release is to be productized, all incorporated files are tagged with the release name. With this special tag it is possible later on to rebuild this particular version regardless of all changes that occurred after the tagging took place.
One of the most popular version control systems is Subversion. A very good introduction on how to work with it is available here. NetBeans supports the use of Subversion or CVS directly; no other client-side tools or libraries are necessary.

Even for smaller projects the use of a version control system should be mandatory.

Deployment Environments

To ease the development, test and production cycle, it is favorable to decouple the different stages. This is done by creating so-called environments. Most often a test, an integration and a production environment are set up.
The software developer should be capable of changing whatever piece of code he wants to modify. The testers prefer a stable environment and stable software. And the live systems definitely shouldn't be touched without having tested and certified the software package to be released.
The requirements of the environments are different and can be tweaked to the particular needs. The requirements of the development environment, for example, aren't that high. Very often this environment is implemented by means of VMware images. Depending on the selected resources, it might be necessary to choose a particular operating system or to use more than one VMware image with different operating systems. Active Directory, for example, is only available on Windows Server.
The test systems should mimic the production environment as closely as possible, because you have to be able to check for network or architecture related issues. Usually this means using the same network architecture including routers and load balancers, but sticking with single systems for the different services. If no load tests are necessary, the use of virtual images is possible even here.
The exact deployment process differs from project to project. Most customers have already developed their own understanding of how a deployment should occur. Usually there are several rules. For example, the developer might not be allowed to test the software, or the installation of the deployment package may only be done by the operators.
A process example might run as follows: development and code changes take place only in the development environment. Most of the debugging is done here as well. After a release candidate is developed, the package is handed over to the test team. They deploy the release in the test environment and run the pre-documented test cases. If bugs are found, they are documented using a bug tracking tool. The process then loops back to the developers, who are in charge of fixing the bugs. If a release is considered stable and no more (critical) bugs show up, the release candidate is approved and is called a release. Now this release can be installed on the live system.

Connecting Resources

Identity Manager projects deal primarily with data: reading it from one or more sources and writing it to one or more targets.
In between, the black magic lies in the correct transformation of the source attributes into the expected target attributes. A transformation routine might transform just one piece of information from one particular format into another, e.g. a date from data type long to data type string. Or it takes several pieces of information and reformats them into a single attribute, for example the given name and the surname concatenated with a single blank in between, forming a new attribute: full name.
For testing purposes it is necessary to have actual data at hand, ideally in all expected permutations. Furthermore there are resources that check the data that is to be written. If a check fails, the write request is not accepted. The real challenge is therefore to supply suitable resources. Normally the source and target systems exist only once in the enterprise. But it is definitely a bad idea to use these systems for testing or, even worse, for development purposes. If they are used and errors occur during development, this can even impact the daily work. Consider for example the accidental rewrite of every employee's Active Directory password: not a single log-on would be possible anymore.
Identity Manager has a notion of bulk operations: if you do something wrong, you will apply the incorrect calculation to every user account that fits the selection criteria. This is the reason why development environments usually rely on simulated resources, flat files or, if possible, data copies on separately installed resources of the same type.
If you work on data copies, you have to take into account that this data is sensitive. If possible, work on anonymized data to deal with this. On the other hand, anonymizing changes the semantics of the data; it is possible that the data quality can't be determined anymore. Testing the initial load and reconciliations against copies of the unaltered original data is a good approach. The routines and data related results can be used to figure out further processes and recommendations for the data owners of the particular resources. Afterwards the Identity Manager repository should be re-initialized to remove the actual data from the development and/or test environment.

Automatic adjustment of the Resource Definitions

The XML objects targeted at various environments differ mostly in the configuration attributes: host names, proxy users, passwords, table names or, if used, resource-specific encryption keys.
The IDM IDE, or more precisely the underlying CBE, supports the definition of template strings. All strings for a particular CBE target are aggregated into a file. During the build process the templates are substituted with the patterns defined in that file for the selected CBE target. With this approach it is possible to build deployment packages for different target environments without manual intervention.
As an example, assume a file for the test target with the following content (the token name and syntax shown here are illustrative; check the CBE documentation of your version for the exact convention):

  idm.res.ldap.host=testhost.example.com

In the XML object defining this resource the template string now can be used:

  <ResourceAttribute name='Host' value='%%idm.res.ldap.host%%'/>

Object ID Management

One of the more subtle issues around producing a customized IDM release is the management of the XML objects' object IDs. They are normally created automatically by IDM during the creation or import of a new XML object and therefore are not actually needed in the original XML document. There may be situations, however, where you need to know the object ID of one object to put it into a reference to that object inside another object.
Also, you may not know from the start which objects you may need to reference later on. Therefore it's good practice to always select an object ID manually and put it into the XML object. This ensures that references will never go astray and makes the whole process easier to understand, more manageable and more consistent. In fact the IDM IDE checks for the presence of object IDs while importing and issues warnings if they are missing.
Object IDs should be built according to a naming policy; stick with it throughout the project. A commonly used process forms IDs like this: #ID#:<object-type>-<short-name>:<version>. The version is relevant for cases where you may need to change the object ID. You will need this when updates of TaskDefinitions for workflows with live instances are involved, as IDM will not normally allow you to change those TaskDefinitions for consistency reasons.
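Following such a policy, the header of a versioned TaskDefinition might look like this (project and object names are illustrative):

```
<TaskDefinition id='#ID#:Tdef-ProvisionUser:v1'
                name='Acme Provision User'>
  ...
</TaskDefinition>
```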

Naming rules

If more than one person is working on a project, it is common practice to define project-specific naming rules. Sticking to the rules helps to identify a file and its type just by the name. If the attribute naming is consistent, it is much easier to identify attributes and map them between resources and the internal namespace. It doesn't matter what the definitions are, as long as every participant complies with them.
To be as portable as possible, omit spaces and non-ASCII characters in filenames; it makes life much easier.

XML files

XML file names strictly follow these patterns:

Object Type         Name Pattern
Admin User          <ProjectName>-User-<Name>.xml
Forms               <ProjectName>-Form-<Name>.xml
Organizations       <ProjectName>-Org-<Name>.xml
Resources           <ProjectName>-Res-<ResourceType>-<Name>.xml
Rule Libraries      <ProjectName>-RuleLib-<Name>.xml
Rules               <ProjectName>-Rule-<Name>.xml
Correlation Rules   <ProjectName>-CorrelRule-<Name>.xml
System Objects      <ProjectName>-Obj-<Name>.xml
Task Definitions    <ProjectName>-Tdef-<Name>.xml
Object IDs

Object IDs (see the preceding chapter) strictly follow the pattern described above; note that the version is only mandatory for Task Definitions. The object types from the naming table above are used.

If pre-existing IDs are changed, it is a good idea to clean out the repository before redeployment. This might not be strictly necessary, but it helps a lot if, for example, the IDs of resources are modified.
Java files

Java file and class names follow the standard Java naming conventions (camel casing): class names start with a capital letter, while method and variable names start in lowercase; every subsequent word starts with a capital letter.
This is not so easy for non-English languages, as, for example, German fosters the concept of compound words, which makes it hard to figure out where a new word begins.

Very often it's a good idea to agree on English as the language for naming and documentation.
Source code documentation

Yes, it's a good idea to document your code, regardless of whether it's XML or Java. For Java quite a few how-to and best-practice documents are available; see Writing doc comments or this blog entry for more information. Unfortunately the same is not true for XML code.
We can use XML comments to put comments into our XML documents wherever we want.

<!-- This is a comment --> 

Unfortunately these comments are stripped off when the file is parsed and re-written by the IDE. That happens every time you switch from XML to Design view in the IDE and do something; even re-arranging the little images is enough. When switching back to XML view, the file is re-written and all XML comments are stripped and gone.
To address this shortcoming, the waveset.dtd defines a <comment> tag that can be used in a lot of places, but not everywhere.

Nevertheless I prefer to document the code with XML comments and use the little tricks that are possible with a real IDE like Netbeans. Netbeans records each and every modification made to the code and gives the developer the opportunity to undo/redo what took place. If you look at the following images you can see the little red triangles and the light blue square at the left of the screen. If you click on either one, Netbeans shows what has been done and you can roll back. It's cumbersome and annoying, but it works.

My XML files start with a header like the following:

<?xml version='1.0' encoding='UTF-8'?>
<!DOCTYPE TaskDefinition PUBLIC 'waveset.dtd' 'waveset.dtd'>
<!-- _______________________________________________________________________

    <ObjectType>    <Name of your module>

    Description:    <A short verbal explanation of what this module is expected to do>

    Arguments:      <A list and description of the expected arguments>

    Returns:        <Return values if any, including errors>

    Open issues:    <If there are open issues, state them here>
    _______________________________________________________________________ -->

Development Environment

This environment usually consists of virtualized servers with support for all necessary resources. Depending on the project approach, these resources are either real installations or are mimicked using flat files or simulated resources. Larger systems, like for example a SAP system, are rarely installed; a database or an Active Directory is more easily incorporated. Developers don't work directly on these systems; they are used as a runtime environment.

Test Environment

This environment is used to check different aspects of the solution. First there is the functional test of the application: do the workflows and forms act according to their specification? Secondly, the initial load and the reconciliations have to be tested with data from the live system. Thirdly, the data quality has to be checked: is data cleansing necessary on the various resources before they can be connected? Next, the semantics of the written data have to be checked. Are the attributes written by Identity Manager in the same format as the natively written ones? Can the local application deal with them, and does it perform the way it is expected to? In the case of Active Directory, can a newly created user account be used for logging on to a Windows workstation?
If load tests are to be conducted, or the time needed for data import is to be determined, this environment must mimic the production environment as closely as feasible. With test cases like these it must be taken into account that the Identity Manager performance largely depends on the performance of the connected resources.
The test environment very often consists of genuine resources: if you are lucky, as a real, separate test system; if you are not so lucky, as a subsystem, a cloned database table or a directory subtree on the production systems. Note that the use of extra tables or subtrees on the production systems can impact daily operations. Some Active Directory attributes, for example, have an impact on the server as a whole, even if these attributes are written only to a specific subtree. The much better approach is to separate production and test environments.
As the Identity Manager, and especially the Access Manager, are web based applications, the production environment is usually configured in a layered 3-tier architecture. This architecture has a large impact on the configuration of the different systems of the overall architecture. As these configurations can be very complex and error prone, you would prefer to have the test architecture as similar to production as possible. This mainly concerns the number of different servers, SSL termination points, redirect configurations and the configuration of the load balancers.

Production environment

At the customer site the overall architecture of the production environment is usually already fixed. There are several servers around that can't be changed, a preferred vendor is selected, and default server systems that have to be used are defined. Very often even the application server is predetermined. The project and the server components have to integrate into the existing landscape, and only for very good technical reasons is an exception acceptable.
Very often the deployments can't be done by the project team; then the operations administrators are responsible for them. Keep in mind that they have to have the appropriate know-how, or it must be acquired in time.
Another task is to incorporate the Identity Management systems into the already existing system management and monitoring.
After the deployment the initial load of the data and/or the reconciliations take place. It should be done as specified and already tested in the test environment. As this process might be quite time consuming, it is recommended to look for an appropriate time window to interfere as little as possible with daily operations. Ideally all the tests defined and tested in the test environment are applied to the freshly installed production environment.

Developer Workstations

A current PC or laptop equipped with a good amount of memory, an internet connection and access to the version control system and the various environments is sufficient as a development workstation.
The operating system doesn't matter that much anymore, as most of the current IDEs and tools are available for all of the popular operating systems. Netbeans, for example, is available for Solaris, Windows, different Linux flavors and Mac OS.
After the initial installation of the IDE the developer should do a first check-out of the project bits and pieces from the version control system. He is then ready to create new files and to modify or delete existing ones. Committing the local changes stores them in the repository, so the new versions become accessible to the other developers.
If the developer workstation is equipped with even more memory (2 - 4+ GB), it's possible to run local target environments as virtual images on the same machine.

Development Tools


All files, including the XML files and the Java or JSP files, can be created and edited with your editor of choice. Stories are told that hard-core developers still use plain vi and are quite happy with it. The rest of us might prefer a so-called Integrated Development Environment (IDE). Tools like these come with wizards, assistants, context sensitive help, code completion and the integration of debugging, profiling and other tools.
Examples for IDEs are Netbeans or Eclipse.
Up to version 7.1.x, Identity Manager comes with a bundled plugin for the Netbeans IDE. Starting with Identity Manager 8.0 it is a separate open source project. The plugin can be downloaded here. The current stable version (March 2009) is tested and supported when used with Netbeans 6.1. Nevertheless, I have run the same plugin successfully with Netbeans 6.5/6.5.1; keep in mind that this is not supported, though. A version for the Eclipse 3.3.1 IDE is available at the same web site; it is, however, currently not a supported product.
Up to Identity Manager 7.1.x the plugin is documented in IDM_DeploymentTools. Starting with Identity Manager 8.0 the plugin is supposed to be documented by manuals located at the web site. At the time of this writing the web site only documents the changes relative to the 7.1 documentation; a full featured manual is not available yet. Therefore it is best to stick with the above document.

The use and handling of Netbeans is learned best by walking through the available tutorials and how-tos available here.

The IDM IDE supports two flavors of IDM projects:

  • Identity Manager Project
    This is recommended as the primary project type for Identity Manager deployments. It includes the CBE (Configuration Build Environment) and supports editing and debugging of Identity Manager objects, Java, and JSPs. In addition, it supports an embedded repository and is integrated with the application server embedded in Netbeans.
  • Identity Manager Project (Remote)
    This is a lightweight Identity Manager project. It is primarily intended for debugging and making small modifications to an external Identity Manager installation. It only supports Identity Manager objects, not Java or JSPs.

The latter project type relies on a so-called IDE Compatibility Bundle. This file is located under "sample/" in your Identity Manager installation directory. For Identity Manager 7.0, the compatibility bundle may be downloaded from the Identity Manager 7.0 product download webpage.

The bundle is version independent and can be used for IDM versions 6.0 SP3 onwards; this is true starting with the bundle from IDM 7.1.


The underlying build process as utilized by Netbeans makes use of a tool called ant. Ant is a build tool developed by the Apache Software Foundation. It is similar to the make tool used within the C/C++ development cycle, but is completely written in Java. Make uses a specially formatted Makefile for defining the build process and the dependencies; ant uses an XML file, by default called build.xml. The Apache Software Foundation claims that ant is similar to make, but without its drawbacks. The documentation is available here; it even discusses the integration into various IDEs.
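To illustrate the concept, a minimal build.xml might look like the following. This is a generic ant sketch, not the CBE build file shipped with the IDM IDE; the target and directory names are made up:

```xml
<!-- Minimal ant build file sketch (illustration only) -->
<project name="example" default="build">

  <!-- Compile all Java sources below src/ into classes/ -->
  <target name="compile">
    <mkdir dir="classes"/>
    <javac srcdir="src" destdir="classes"/>
  </target>

  <!-- Package the compiled classes; runs compile first -->
  <target name="build" depends="compile">
    <jar destfile="example.jar" basedir="classes"/>
  </target>
</project>
```

Running "ant build" executes compile first because of the depends attribute, which mirrors make's dependency handling.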

Before the advent of the IDM IDE, Sun recommended the use of a field developed tool called the Custom Build Environment (CBE). This tool is also based on ant. A very good and still valid article can be found here in German or English.

The current release of the IDM IDE is delivered with several flavors of ant build files. One is targeted for command line use, one for the use with Eclipse and a third one for the use with Netbeans.

Build Process

After the creation of a new Netbeans IDM project a file "Readme.txt" is created in the project root. This file holds the most current recommendations for using the IDM IDE (partly) and the Configuration Build Environment (CBE). This file details the directory structure necessary for a Netbeans project, the build process and the different build targets.

A project directory
The directory <ProjectRoot>/idm-staging/ holds the extracted WAR file of the Identity Manager version to be used. While building the project, its content is copied into the directory <ProjectRoot>/idm. Afterwards, the content of the subdirectory hierarchy <ProjectRoot>/custom is copied to <ProjectRoot>/idm. A file from idm-staging with the same name as a file located in custom will therefore be overwritten. In other words, the content of <ProjectRoot>/custom takes precedence over the content of <ProjectRoot>/idm-staging/.
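The copy order can be sketched with plain shell commands. This is an illustration of the precedence rule only, using throwaway directories, not the actual ant logic:

```shell
# Illustration of the build's copy order: custom wins over idm-staging.
tmp=$(mktemp -d) && cd "$tmp"
mkdir -p idm-staging custom idm
echo "product default page" > idm-staging/index.jsp
echo "customized page" > custom/index.jsp

cp -r idm-staging/. idm/   # step 1: extracted product WAR content
cp -r custom/. idm/        # step 2: customizations overwrite name clashes

cat idm/index.jsp
```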
The file build.xml in the project root is the CBE/ant specific build file. It's targeted for command line oriented use with ant.
The file holds the CBE specific configurations.
The * files store the substitution patterns as described above. The patterns are applied to all files from <ProjectRoot>/custom. See the default * files for more information.
The directory <ProjectRoot>/src is used for storing all of the project Java code. The compiled class files are located in <ProjectRoot>/image/idm/WEB-INF/classes.
The directory <ProjectRoot>/custom/WEB-INF/config/ holds all XML objects. After applying the string substitution, they are copied to the directory <ProjectRoot>/image/idm. The import file generator later operates on this directory hierarchy. This tool builds the file <ProjectRoot>/image/idm/WEB-INF/config/custom-init-full.xml, which includes every XML object file. It simplifies the import of the XML objects into the Identity Manager repository.
The files <ProjectRoot>/custom/WEB-INF/config/custom-init-preprocess.xml and <ProjectRoot>/custom/WEB-INF/config/custom-init-postprocess.xml are copied to <ProjectRoot>/image/idm/WEB-INF/config and can be used to deal with special situations, where XML objects have to be imported before any other object is imported or after the last object has been imported.
Additional libraries and JAR files needed by ant and/or the IDM IDE are stored in <ProjectRoot>/tools.
If everything went well, the deployment files and XML import files are stored in <ProjectRoot>/image/. The directory <ProjectRoot>/image/idm/ holds the extracted deployment file. This is the directory that is used when Run Project is selected inside the IDM IDE. The same is true for the Run LH Command.
The <ProjectRoot>/image/doc/javadoc/ directory is the container for the javadoc files, generated from the Java files located in <ProjectRoot>/src.
The hierarchy <ProjectRoot>/image-profiler/idm/ holds a copy of <ProjectRoot>/image/idm. All class and JAR files are instrumented for profiling.
Structuring the project directory

All project specific code and objects can be stored directly in the directories as described above, but it's best practice to structure your code.

Create a directory structure to partition your code further.

A structure used in several projects looks like this

<ProjectName>/custom                             <-- JSP Code
<ProjectName>/custom/WEB-INF/config              <-- XML Objects
<ProjectName>/custom/WEB-INF/config/AdminUser    \
<ProjectName>/custom/WEB-INF/config/Forms         \
<ProjectName>/custom/WEB-INF/config/Organization   \
<ProjectName>/custom/WEB-INF/config/Resources       \ structure example
<ProjectName>/custom/WEB-INF/config/Roles           /
<ProjectName>/custom/WEB-INF/config/Rules          /
<ProjectName>/custom/WEB-INF/config/SystemObjects /
<ProjectName>/custom/WEB-INF/config/Workflows    /
<ProjectName>/custom/WEB-INF/lib                 <-- jar Files
<ProjectName>/idm-staging                        <-- exploded idm.war
<ProjectName>/image                              <-- deployment images
<ProjectName>/nbproject                          <-- build files
<ProjectName>/src                                <-- Java Code
Build targets

Ant, the underlying build tool, works with so-called build targets. A target is a symbol that specifies the file to be built and lists all the necessary files it depends on. If a necessary file is not available, it is automatically built as defined. A build target therefore defines a target, all necessary source files and the way the target is built from the sources.
The following targets are defined in the CBE specific build.xml. There are more, but these are the most important ones:

  • build
    This target creates the project as a whole, but doesn't create a deployment file. The XML objects that have to be imported are located at <ProjectRoot>/image/idm/WEB-INF/config/. The target compiles for the default CBE target (sandbox) if this is not overridden by a command line option or the corresponding IDM IDE property. While building, all Java files are compiled, the pattern substitution is applied to the XML files and they are checked. If configured, the JSP files are also checked. After this target has been run, a full-fledged, ready-to-run local Identity Manager is located in <ProjectRoot>/image/idm.
    A subsequent Create IDM WAR (IDM IDE) or dist (ant) creates the deployment file.
  • Create IDM WAR / dist
    Creates the deployment file as <ProjectRoot>/image/idm-<CBE Target>.war. If necessary, a preceding build is triggered.
  • clean
    Cleans out the project. It deletes the directories <ProjectRoot>/tmp, <ProjectRoot>/image and <ProjectRoot>/image-profiler/.
  • dist-delta
    A Zip file with the project specific modifications is created. It is called <ProjectRoot>/image/idm-<target environment>.zip.
  • dist-jar
    A Java archive with the project specific compiled Java class files is created at <ProjectRoot>/image/idm-<company>.jar. This file can be copied to <ProjectRoot>/custom/WEB-INF/lib; then the class files aren't necessary anymore.
  • jdoc
    Creates the Javadocs from the Java files and places them into <ProjectRoot>/image/doc/javadoc.
The Command Line

Despite the fact that everything can be done from within the IDM IDE, it's good practice to create official builds using the command line: it can be scripted, and everything can be logged and documented.
A deployment file is created using the following command:

ant -Djsp.validation.enabled=true -DtargetEnvironment=<target environment> clean dist

If the project specific files should be archived, or for upgrade/migration purposes, a Zip file containing only these modifications can be created with the following command:

ant -Djsp.validation.enabled=true -DtargetEnvironment=<target environment> clean build dist-delta
* files

These files hold the substitution patterns for the various environments. There is one file for each environment.
One of these files is created automatically for every project; typically, additional files are created for the test and the production environment. If more or other CBE targets are necessary, feel free to create them.
The patterns from these files are applied to every file from <ProjectRoot>/custom. File types that are listed in the property install.pattern.substitution.excludes defined in <ProjectRoot>/ are excluded. Per default it is defined as follows:


If there are other, project specific, binaries stored somewhere in <ProjectRoot>/custom, these file types must be added to the property. Otherwise the substitution will be applied to the binary files, and it is more than likely that they will be corrupted afterwards.
Note that the pattern substitution is a plain string replacement. The substitution process cannot create new XML objects or JSP language elements, as would, for example, be feasible with the C pre-processor. In our case that would confuse the editor and the debugger of the IDE anyway.
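For illustration, such an addition could look like the following fragment. The property name is taken from the text above, but the value syntax and the listed extensions are assumptions; check the default in the shipped properties file:

```
install.pattern.substitution.excludes=.jar,.class,.gif,.jpg,.png,.zip
```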
By default ant builds for the sandbox CBE target. If you want to build for another target, the corresponding command line options have to be supplied.
Within the Netbeans IDM IDE the CBE target is selected in the project properties; it also defaults to sandbox.
If the CBE target has been switched, a preceding run with the clean target is mandatory. Otherwise there might be inconsistencies in the created deployment files.

Creation of XML import files

The import generator, or object builder, is a tool bundled with the IDM IDE. It supports an incremental mode and a full mode.
The incremental mode is used to load the repository with the necessary files during auto-publish.
The full mode creates the import file, including all XML objects, for a full deployment.

Note that you still have to import the <ProjectRoot>/image/idm/sample/init.xml file.

The files are created in the directory <ProjectRoot>/image/idm/WEB-INF/config/ as custom-init-full.xml or custom-init-incremental.xml.
The very first line of these files includes the file custom-init-preprocess.xml; the very last line incorporates custom-init-postprocess.xml. These files can be used for special requirements: they can be tweaked manually and allow pre- and post-processing of the XML objects. Usually they are, and should remain, empty.
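A generated import file might look roughly like the following sketch. The include mechanism mirrors the one used in sample/init.xml; the listed object file name is a made-up example:

```xml
<?xml version='1.0' encoding='UTF-8'?>
<!DOCTYPE Waveset PUBLIC 'waveset.dtd' 'waveset.dtd'>
<!-- Sketch of a generated custom-init-full.xml (file names are examples only) -->
<Waveset>
  <ImportCommand name='include' file='WEB-INF/config/custom-init-preprocess.xml'/>
  <ImportCommand name='include' file='WEB-INF/config/Forms/Proj-Form-SelfReg.xml'/>
  <ImportCommand name='include' file='WEB-INF/config/custom-init-postprocess.xml'/>
</Waveset>
```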

IDE specific components

The deployment package is created for a specific version of Identity Manager. The file <ProjectRoot>/nbproject/ contains the version specific files and meta data. It is created at project creation time as a subset of the product deployment package.
Netbeans identifies an IDM IDE project based on the file <ProjectRoot>/nbproject/idm-project.xml. These two files are mandatory for an IDM IDE project.
For project specific modifications the file <ProjectRoot>/nbproject/private/idm-project-private.xml is used. Properties set in this file overrule the defaults set in <ProjectRoot>/nbproject/idm-project.xml.

Where is <IDM-Root> located when using Run or Debug in Netbeans?

If you utilize the standard Identity Manager Project (not the remote one) and choose to Run or Debug your project, the root directory is set to the place where all the bits and pieces are merged: <ProjectRoot>/image/idm. Therefore, if you are not sure whether all your Java code really got compiled and deployed, have a look at <ProjectRoot>/image/idm/WEB-INF/classes.

Setup JSP validation

To ensure that JSP files are syntactically correct, they can be checked while building the project. To do so, a tomcat installation must be incorporated into the project. The delivery package for Netbeans 6.5 contains the glassfish-v2 container, tomcat and a prelude of the glassfish-v3 container. By default both glassfish versions are installed; if you wish to install tomcat, it must be explicitly selected. This is done in the modify install dialog of the Netbeans installer. If you have already installed Netbeans without tomcat, this can of course be done in a subsequent step. Once you've taken this initial hurdle, it's as easy as copying the necessary jar files to the <ProjectRoot>/tools directory. It's a good idea to place these files under version control.

Test / Debug

Before the advent of the IDM IDE, debugging an Identity Manager project was essentially done using the application server log files. The position and the layout of these logs differ a bit depending on the application server. Very often the application server comes with an administration console that can be used to browse the logs.
Additionally, the Identity Manager provides its own debugging facility for accessing internal information and data structures. Use the URL http://<server>/idm/debug to enable tracing and to have a look at, or even modify, the XML objects.
Browse IDM_Troubleshooting for in-depth information on how to debug Identity Manager.
With the advent of the IDM IDE, a debugger and profiler with support for single step evaluation is built into Netbeans. See IDM_Troubleshooting for more information on this.

In bigger projects it can happen that the default settings for the Java virtual machine of the application server are not sufficient anymore. As the Identity Manager is just another servlet application, the general Java tuning tips apply. A good starting point is again IDM_Troubleshooting.

Setup of an Example project

To visualize the whole process, an example project is set up using a VMware image. To keep things simple, the same image is used for the version control system and for the Identity Manager. To get used to the distributed development approach, Netbeans is installed locally on the host system (e.g. a laptop).

Setup the Servers

Over time the workshop was used in different projects with different system requirements. To keep the checklist simple, the OS specific parts were relocated into an appendix. See Setup Using Open Solaris, Setup Using Redhat Enterprise Linux 4, Setup Using Red Hat Enterprise Linux 5.2 for more information.

Version control system

For this example project the subversion package is chosen as the version control system. Very often it is already packaged with Unix based operating systems; if not, it is available on the web for nearly every OS. The tigris web site is a very good starting point for the most recent binaries. If you prefer to work on Windows systems, there is a very good how-to with links to the needed binaries.
The Subversion system can be accessed using different protocols. Over the years http(s) access has proven to be the preferred one. For the implementation of http(s) access to a subversion repository, a web server with a WebDAV module is required.
To run a Subversion repository, disk space is needed, preferably with integration into a backup system. The burden on the CPU can be neglected, so very often it can be installed as an additional service on an already existing server.
The filesystem of the repository can be structured freely, but according to the subversion book the following layout can be considered best practice:

<ProjectName>/trunk
<ProjectName>/branches
<ProjectName>/tags

For an explanation of the sub directories branches, tags and trunk and how they are used throughout the development process, see the subversion book.

Installation Checklist

In general the setup requires the following steps:

Installed in the VMware image:

Component            Instructions
An OS                OpenSolaris, RHEL4 or RHEL5
apache 2
Application Server
java 1.5.x

Locally on the laptop:

Component            Instructions
Netbeans 6.1
idmide 8.1
VMware was chosen over VirtualBox because on several host platforms bridged networking was only available with VMware. With the availability of VirtualBox 2.x this will change.
Creating the IDM Repository

Identity Manager comes with a set of sample scripts for the creation, update or deletion of the repository database. These scripts are stored in the <IdmRoot>/sample directory. By default a database user waveset with the password waveset is created. Make sure to adapt this to your needs.

Make sure that a password for the user root is set after installing mysql.

The Identity Manager database is created with the following command:

cd <IdmRoot>/sample
mysql < create_waveset_tables.mysql -u root -p

If you want to use the password dictionary feature of IDM make sure to create the needed database:

cd <IdmRoot>/sample
mysql < create_dictionary_table.mysql -u root -p
Identity Manager Deployment

Deployment can be done in several different ways. If the deployment file is not stored on the same server as the application server is running on, it's easiest to use the admin interface of the application server. For the Sun Application Server this is

If you don't want to change any default, the auto deployment feature of Sun Application Server can be used. If you have to tweak some defaults, use the deployment command. Note that both auto-deployment and the deployment command need the deployment file to be available on the server the application server is running on.
As we want to change the contextroot from the default (which is derived from the name of the WAR file), to something more suitable, we use the deployment command.

asadmin deploy --user admin \
               --passwordfile /opt/SUNWappserver/password.txt \
               --contextroot idm idm-
Populate the repository

The basic configuration and the default objects that come with the product have to be imported into the repository. They are delivered as the already discussed XML objects and have to be imported using a bundled command line tool. As the just deployed instance of the Identity Manager is already configured to our needs, it is easiest to use it for this.
On the machine the application server is running on, type in the following commands:

cd /opt/SUNWappserver/domains/domain1/applications/j2ee-modules/idm-
export WSHOME=`pwd`
sh bin/lh import sample/init.xml
This example assumes a Unix based system. For Windows, adapt the commands accordingly.
In some versions of the Identity Manager the user "recon" sometimes got lost. Check for it (just try to log in) or re-import the file sample/admins.xml as a precaution.
sh bin/lh import sample/admins.xml

If several instances of Identity Manager are to be installed, make sure that repository data is imported just once. Only the WAR file must be deployed on every application server.

The basic installation is now finished. For a first impression access the Identity Manager on http://<server>:8080/idm/.

The default credentials of the admin user
user: configurator, password: configurator
Make sure to change the default password as soon as possible.
Exporting the Server Encryption Key

As a new random Server Encryption Key has been created while populating the repository, this key must be exported and stored in your project directory.
To do that, log in to Identity Manager and select Server Tasks from the tabs. Select Run Tasks from the next line of tabs and select
Manage Server Encryption from the list of available tasks.

The resulting XML file contains all ServerKeys that are currently defined and can hold an unlimited number of them. Therefore, if you have to re-initialize the repository, just export the newly created key and merge it with the keys already in this file.
This file is located in <ProjectRoot>/custom/WEB-INF/config/SystemObjects/ServerKeys.xml.

Installation of a development workstation

This is an easy one. The system should have at least one GB of memory, the more the better. All current workstations, PCs or laptops have enough CPU power to drive the Netbeans IDE. Therefore the setup of a development system requires just the installation of Netbeans and the IDM IDE plugin. The operating system doesn't matter that much anymore; Netbeans is available for nearly every current operating system.

Installation of Netbeans

Download the software here. As of this writing (March 2009) the current version is Netbeans 6.5.1. Grab the full blown package, as we need tomcat for the JSP validation but want to use glassfish for deployment purposes. Starting with Netbeans 6.5 the default deployment container is glassfish-v3, because this container is very slim (even slimmer than tomcat) and has very short start-up times. If you want to use another container, you have to select it explicitly (see below).
Check that a current JDK is installed on the development workstation. Make sure to have it close to the version that will later be used in the production environment. Sometimes, especially between major versions, the packaging changes, and jar files bundled with one JDK are not available with the other. Now install Netbeans according to the packaging mechanism of your operating system. Make sure to modify the installed options to include tomcat; this can be done by selecting the option "modify" in the installation program.

Installation of the IDM-IDE plugin

Download the package from the web site.

As of this writing, version 8.1 is supported for Netbeans 6.1, but has also successfully been used with Netbeans 6.5/6.5.1.

Fire up Netbeans and select the Plugin Manager to install the downloaded module. See the documentation for more in depth information.

My first Project

For demonstration purposes a very simple, self-contained example was chosen: a user self-registration will be implemented. A user yet unknown to the Identity Manager gets the opportunity to register himself. The sources are available here.

Creation of a new Project in subversion
The following is usually the responsibility of an administrator; it is rarely done by the developers themselves.

On the version control system a new repository is created for this customer/project. The basic structure is created and everything checked into subversion.

svnadmin create /var/www/svn/<ProjectName>
chown -R apache.apache /var/www/svn/<ProjectName>
cd /tmp
svn co http://localhost/svn/<ProjectName> --username demo1
cd <ProjectName>
mkdir tags branches trunk
svn add tags branches trunk
svn commit -m ""
rm -rf <ProjectName>
This example is using the command line tools on a unix based system. There are several graphical subversion clients available that can be used instead
Create a version controlled project in Netbeans

Create a directory that will hold all Netbeans projects on the development workstation, and check out the basic project structure that has already been created on the version control system.

mkdir <HomeDir>/NetBeansProjects
cd <HomeDir>/NetBeansProjects
svn co http://<SvnServer>/svn/<ProjectName>

Now start Netbeans and create an IDM IDE project. As the directory <ProjectName> is already version controlled, the status of the different files is immediately visible.

Use the menu "File/New Project" to open the project dialog window. Select a Web / Identity Manager Project. On the following pages more information concerning the project is collected. The project location is set to <ProjectName>/trunk.

Now the location of the product deployment file is needed. By selecting the WAR file, you choose the version of the Identity Manager the project is targeted for.
Next, the location of the embedded repository can be changed. That is where the hsqldb files for this project are stored.

Stick with the defaults and finish creating the project. The window closes and the newly created project can be seen in the project tab of the leftmost Netbeans window.
The output window of the IDE lists the actions that are run in the background. If everything went alright, the very last line should start with "BUILD SUCCESSFUL".

Select the proper file encodings

If you are also developing Java code and work in a heterogeneous environment with different operating systems, make sure that all parties involved agree on the same file encoding. Otherwise expect funny results when working with non-ASCII characters like Umlauts (ÄäÖöÜü). To do so it's a good idea to set the encoding project-wide. Make sure you have selected the project tab from the upper left window.

Right click on your project and select "Properties". In the upcoming window select the category "Java Source" and set the encoding to the appropriate value, which as of today is most often "UTF-8".
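To see what is at stake, the following snippet (assuming it is saved as UTF-8 and that the standard iconv and od tools are available) shows that the same Umlaut is encoded differently in UTF-8 and ISO-8859-1:

```shell
# "Ä" is two bytes (c3 84) in UTF-8 but a single byte (c4) in
# ISO-8859-1 - a file written with one encoding and read with the
# other shows garbage instead of the Umlaut.
printf 'Ä' | od -An -tx1
printf 'Ä' | iconv -f UTF-8 -t ISO-8859-1 | od -An -tx1
```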

Importing the User Self Registration files

Download the file and extract it to a temporary directory. Copy the content of jsp/user from the Zip file into the directory <ProjectRoot>/custom/user.
Then copy everything from objects into the directory <ProjectRoot>/custom/WEB-INF/config. Select the file tab in Netbeans and your project should look similar to the next picture.

To enable the display of versioning labels, select the Netbeans menu View/Show Versioning Labels. The information in the square brackets shows the file status in regards to version control.

Enabling JSP Validation

If you have created or modified JSP files, it's a good idea to check them before creating the deployment package. The validation is done by simply compiling them. For compilation Netbeans relies on parts of the tomcat package, which have to be integrated into the project hierarchy. The tomcat bundled with the Netbeans installation is a good candidate for integration.

# Integrate tomcat into the project
mkdir <ProjectRoot>/tools/tomcat/bin
mkdir <ProjectRoot>/tools/tomcat/lib
cp <tomcat-home>/bin/*.jar <ProjectRoot>/tools/tomcat/bin
cp <tomcat-home>/lib/*.jar <ProjectRoot>/tools/tomcat/lib

Set the properties in the file <ProjectRoot>/ to enable JSP validation:

jsp.validation.enabled=true
tomcat.home=tools/tomcat
Selection of the internal deployment container

Netbeans supports any number of deployment containers that can be used for local deployments. During installation we decided to install all of the packaged runtime containers, therefore we now have to configure whether glassfish v2, glassfish v3, or tomcat should be used. With Netbeans 6.1 the default container is tomcat, with Netbeans 6.5 it is glassfish-v3 (prelude version).
To explicitly configure a container, edit the file build-netbeans.xml and add an attribute serverID to the property idmide.simpleDeploy.
If you simply have to decide which type of internal deployment container should be used, the serverID attribute is all you need.
The possible values are

  • Tomcat60 for tomcat,
  • J2EE for glassfish v2, and
  • gfv3 for glassfish v3.

As the new glassfish-v3 is really fast on startup, I explicitly select it with the following statement:
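A sketch of such an entry in build-netbeans.xml, assuming the property/attribute form described above; the exact element shape may differ in your version of the IDM IDE:

```xml
<!-- build-netbeans.xml: pin the internal deployment container to
     glassfish v3 (element form is an assumption, see text above) -->
<property name="idmide.simpleDeploy" serverID="gfv3"/>
```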


If you have several instances of one container type, you have to specify which instance to use. Netbeans supports a different attribute to the idmide.simpleDeploy property for doing exactly this; it's called serverInstanceID.
Unfortunately the syntax isn't documented anywhere (at least I haven't found it), but if you register several application servers in your Netbeans and "Run" a project without pre-setting serverID or serverInstanceID, the correct settings are written to the Netbeans log output window.

On my Mac with a local glassfish installation and the 3 internal containers they are defined as follows:


Therefore to explicitly select my local glassfish-v2 container, set the following:
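A sketch, assuming the same property form as for serverID; the instance ID value is a placeholder - copy the real one from the Netbeans log output as described above:

```xml
<!-- build-netbeans.xml: select one specific registered glassfish-v2
     instance; replace the placeholder with the ID from the log -->
<property name="idmide.simpleDeploy" serverInstanceID="ID_FROM_NETBEANS_LOG"/>
```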

Up to and including Identity Manager 7.x the use of glassfish is not recommended.
The sandbox environment is using a hsql database for the repository. This database is used in a way that consumes lots of memory.
This was fixed with IDM 8.0 and IDM IDE 8.x.
Even with the 8.x fixes the use of the internal hsql database requires tons of memory.
If you are running into Java heap space errors while working under control of Netbeans, try increasing the -XmsNNNm -XmxNNNm options of the JVM.
I found values above 768 worth a try. I am running my Netbeans with 1024 right now.
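The JVM options for the IDE itself live in netbeans.conf, in the etc directory of the Netbeans installation. A sketch with the values mentioned above; the path and the exact numbers are examples:

```shell
# <NetbeansInstallDir>/etc/netbeans.conf - raise the IDE heap limits
# by adding -J-prefixed JVM options (values are examples)
netbeans_default_options="-J-Xms256m -J-Xmx1024m"
```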
Putting the project under version control

As we checked out the project from Subversion before we created the Netbeans project, the directory is already under version control. That's the reason why we can already see labels like [New] beside the directories and files. Now the challenge is to exclude some of the directories from being stored in the subversion repository on commit. These are directories that are strictly local or are recreated on every build and can be considered temporary. In our case these are the directories image, tmp and nbproject/tmp, located in the <ProjectRoot>. It's as easy as clicking the directory in the file tab and selecting the menu entry Versioning/Ignore.

To store everything in the repository, select the file tab, click on the <ProjectRoot> and select the menu Versioning/Commit. A window pops up asking for the commit message that should be added to the files for this commit. The very first run takes quite some time, as the whole staging directory is also transferred to the repository. If the repository is write protected like ours, another window pops up asking for the credentials to log into subversion for writing. The credentials are cached; subsequent commits won't ask for the credentials again, as long as they are not changed on the repository.

If you forgot to ignore the above mentioned directories and committed them to the subversion repository, you will run into trouble. Subversion creates hidden sub-directories (.svn by default) in every directory it controls.
On the next clean command the whole <ProjectRoot>/image directory is deleted, and with it all of the .svn sub-directories.
If you then try to commit your project, subversion will complain that the .svn directory is missing.

To deal with this situation you have to delete these directories in your subversion repository, delete/move them locally, and commit to subversion again.

# svn rm http://<SvnServer>/svn/ProjectName/image
# svn rm http://<SvnServer>/svn/ProjectName/tmp
# cd <ProjectRoot>
# rm -rf image
# rm -rf tmp
# svn commit -m "Cleaned out image and tmp directories"
# svn propset svn:ignore 'image
tmp' .
# svn commit -m "Ignore image and tmp"

Note that svn:ignore holds a newline separated list of patterns. Setting the property twice would overwrite the first value, therefore both directories are listed in a single propset call, and the property change itself has to be committed.

The committing and ignoring can of course be done in Netbeans, too.
If you now do a Clean and Build again, the directories will be recreated and should be flagged [Ignored] in Netbeans.

The first start of your application using the sandbox

Select the projects tab and click the <ProjectRoot>. Now select Build/Clean and Build Main Project from the Netbeans menu and then Run/Run Main Project.
This starts the local deployment process, and after some time a browser window with the login page of the Identity Manager should pop up.
To ensure that our modification is included, point your browser to the end user interface http://localhost:8084/Idm/user/login.jsp. If everything went fine, the text "New Self Registration? Click here" can be seen on the login page. This link is not included in the original package.

Creating deployment files for other target environments

If your project is running well in your local sandbox, the next step is the creation of an installation package for the test environment. Create the file defining the environment specific properties. This file is located in <IdmRoot> and has the following content:

# File:
# Target Properties for Test Environment
# Mon Jun 02 11:00:07 CST 2008

# Resource Classes

The property IDM_URL holds the string suitable for our local Vmware Image. If your environment differs, change it to your needs.
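As an illustration, such a file could look like the following sketch; the file name follows the <Target>-target.properties pattern of the sandbox file, and the host and port in the URL are assumptions to be adapted to your environment:

```properties
# File: Test-target.properties (name pattern assumed)
# Target Properties for Test Environment

# URL under which the test Identity Manager answers (example value)
IDM_URL=http://localhost:8080/idm
```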
The resource classes are not necessary for the chosen example. They are just an example of how to deal with different environments and switch between real resources and simulated resources.
The file defines the resource classes as follows:

# File: sandbox-target.properties
# Resource Classes
Switching the repository

The sandbox environment uses an embedded repository. This utilizes a Java based database system which tends to consume large amounts of memory. If your project grows larger it is common practice to switch to another, external repository; for the test environment we will use a MySQL database.
The Identity Manager reads the necessary information for connecting to the appropriate repository from the file "ServerRepository.xml". This file holds just the jdbc string needed for connecting to the repository. For security reasons it is encrypted with the default server encryption key.
To switch to a new repository you simply have to rewrite this file with a jdbc string pointing to the repository of choice.
Among other things, the Identity Manager lh command located in "<IdmRoot>/bin" supports the creation and modification of ServerRepository files.

As different deployment environments most likely utilize different repositories, the management of the ServerRepository.xml files is integrated into the CBE target mechanism of IDM-IDE. The * files carry a special attribute that is used by the deployment mechanism. See the following box for an example.


If this line is present, this file is taken and copied to "<IdmRoot>/WEB-INF/ServerRepository.xml". It is common practice to name the files according to the name of your target environment. In the example above, the name of the target environment is prepended to the default name. Note that the validity of the content is not checked.

These files can be stored anywhere. One appropriate location is the directory "<IdmRoot>/custom/WEB-INF". If stored here they are version controlled. Another place is the "nbproject/private" directory. Files stored there are usually not version controlled and can be considered private. Your private sandbox file is usually located here.

If you are currently using the internal HSQL database and you print out the current ServerRepository.xml file, a line like the following will be shown:

HSqlDBDataStore:jdbc:hsqldb:file:/<directory path to your repository>

If you have to recreate it, use the following command syntax (which is currently missing in the documentation). Note that the -f parameter is only the (pre-existing) path to the repository. The database files are named idm.*:

cd <IdmRoot>
export WSHOME=`pwd`
sh bin/lh setRepo -tHSQL -f/<PathToYourRepository>/ -n -o custom/WEB-INF/sandbox-ServerRepository.xml

Now, for switching to another repository, simply issue the following commands. In our example MySQL is used; if another database system should be used, the command line must be tweaked accordingly.

cd <IdmRoot>
export WSHOME=`pwd`
sh bin/lh setRepo -t Mysql -ujdbc:mysql://localhost/waveset -n -o custom/WEB-INF/Test-ServerRepository.xml
Keep in mind that you still have to create the repository on the database system the jdbc string in the ServerRepository.xml file is pointing to.
After you have switched to an external repository, the IDM-IDE commands for dealing with the internal (HSQL) database won't work on your external repository. You have to utilize whatever tools come with your database.
Creating the deployment package

The deployment package for the freshly created CBE target can be built using the command line or the graphical user interface.
On the command line run the following command:

ant -DtargetEnvironment=Test -Didm.staging.dir=idm-staging

In the Netbeans IDM IDE right click the project root in the project tab. Select the project properties from the context menu.

Select your target from the drop down list Current CBE Target. Netbeans creates an entry for every * file found in <IdmRoot>.
After leaving the window by clicking Ok, a clean and build must be run. Select the project tab, right click the project root, and from the context menu select Clean and Build.
Now create the deployment package. From the context menu select Create IDM WAR. After a successful build the file idm-Test.war is located in the directory <IdmRoot>/image.
Copy this file to the server hosting the Identity Manager for the test environment. To deploy the file from the command line issue this command:

asadmin deploy --user admin --contextroot idm idm-Test.war

or simply drop the file into the autodeploy folder.

Afterwards the XML objects must be imported. To do so, switch to the root directory of the just deployed Identity Manager and import the file custom-init-full.xml.

cd /opt/SUNWappserver/domains/domain1/applications/j2ee-modules/idm-Test
export WSHOME=`pwd`
sh bin/lh import WEB-INF/config/custom-init-full.xml

Setting up additional development workstations

This is even easier. The subversion repository is already in place and the project has already been created.
Just install another machine according to the hints of the preceding chapter.
Then use a subversion client to check out the project we created.
This can be done on the commandline:

cd /<Your local Project Directory>
svn co http://<SubversionServer>/svn/<ProjectName> --username demo1

or utilizing the freshly installed Netbeans:
Select "Versioning/Subversion/Checkout" from the main menu and supply the needed information in the dialog.

Afterwards just open the newly checked out project using "File/Open Project" from the main menu and you are done.


Setup Using OpenSolaris

Components chosen

For OpenSolaris most of the components are already included with the distribution or can be installed via the pkg mechanism.

  • OpenSolaris 2008.05
  • subversion 1.4.3
  • apache 2.2.8
  • Sun Application Server 9.1u2
  • MySQL 5.0.45
  • Java 1.6.0_04 b12
Installation of OpenSolaris as a VMware Image

I had to install into VMware Fusion on a Mac and the installation wasn't that straightforward, therefore I will go into a bit more detail here. Even the VMware manual doesn't help much; VMware explicitly supports only Solaris 10.

  • Download the OpenSolaris 2008.05 LiveCD and install into VMware
  • Choose to install the VMware Tools from the VMware menu.
    As the tools don't come as a native package and can't be installed automatically, fire up a shell.
    The VMware tools installer expects the directory /usr/dt/config/Xsession.d to exist - which isn't used anymore.
    Select the correct screen resolution for your setup. I selected 1024x768.
    Answer any other question with the default - just press <Return>.
    # Install VMware Tools
    cd /var/tmp
    tar xvfz "/media/VMware Tools/vmware-solaris-tools.tar.gz"
    # Create the missing directory
    mkdir -p /usr/dt/config/Xsession.d
    cd vmware-tools-distrib
    # Run the installer
    ./vmware-install.pl
  • Switch to a German keyboard layout
    It turns out that VMware (like VirtualBox) sets up its own keyboard emulation that requires the pc105 keyboard model, regardless of what your hardware really is. Without the following modifications certain keys don't work.
    eeprom kbd-type=German

    Now edit the /etc/X11/xorg.conf file and change the following lines to:

    Option "XkbModel" "pc105"
    Option "XkbLayout" "de"
  • Reboot
    This is not really necessary, a restart of the X11 session should be sufficient.
Installing the version control system

Please add the following packages:

  • SUNWsvn

This automatically adds the packages SUNWneon and SUNWapch22. To enable automatic startup at boot time simply add the Webserver to the SMF Framework.

su -
pkg install SUNWsvn

# Import the manifest for SMF
svccfg -v import /var/svc/manifest/network/http-apache22.xml

# Enable the webserver
svcadm enable svc:/network/http:apache22
Configuring the version control system

A directory for the subversion repository must be created and given to the user that runs the webserver.

mkdir /var/svn  
chown webservd /var/svn

The authentication of users accessing the repository by means of http(s) is done by the web server. Every authentication mechanism supported by it can be used. Additionally, subversion adds mod_authz_svn as an authorization module. For this example we will just use basic authentication supplied by the web server.

Security Issue
The password is traveling the network in clear text. For production installations use a different authentication method or use at least https to secure the connection to the server.
See this page for a discussion on how to set up apache ssl mode.

The most basic approach is to store the passwords in a password file. This file is written with command line tools coming with the web server. It is also possible to use other repositories to store the passwords. Most modern versions of apache support the use of a directory server out of the box.
For our installation we just use the password file.

Create the password file

/usr/apache2/2.2/bin/htpasswd -c /etc/apache2/2.2/subversion.passwd demo1
/usr/apache2/2.2/bin/htpasswd /etc/apache2/2.2/subversion.passwd demo2

Depending on the distribution apache is reading its configuration from different files and directories. In case of OpenSolaris the web server expects the module specific configuration files to be stored in /etc/apache2/2.2/conf.d.
The installation process is described in more detail in the subversion book or in this German-language tutorial.

Create the configuration file for the apache subversion module: /etc/apache2/2.2/conf.d/subversion.conf

# Configuration of the Subversion module.
LoadModule dav_svn_module     modules/

# Only needed if you decide to do "per-directory" access control.
#LoadModule authz_svn_module   modules/

# Example location directive.

<Location /svn>
   DAV svn
#   SVNPath /home/svnroot
   SVNParentPath /var/svn

   # Limit write permission to list of valid users.
   #   Require SSL connection for password protection.
   #   SSLRequireSSL

      AuthType Basic
      AuthName "Subversion Repository"
      AuthUserFile /etc/apache2/2.2/subversion.passwd
      Require valid-user
</Location>

Now make the webserver re-read the configuration:

svcadm restart svc:/network/http:apache22

To check that everything is okay look into the webserver log file at /var/apache2/2.2/logs/error_log. It should contain a line like the following:

[Thu Aug 28 17:23:17 2008] [notice] Apache/2.2.8 (Unix) mod_ssl/2.2.8 OpenSSL/0.9.8a DAV/2 SVN/1.4.3 configured -- resuming normal operations

To check the installation create a repository on the system and import a file.

# Create a repository 
svnadmin create /var/svn/test 
chown -R webservd /var/svn/test
Never ever write directly to the file hierarchy of the repository;
instead always use the subversion commands, either from the command line or via GUI tools.

Now the sub directories as proposed by the subversion book have to be created. To do so check out your new, empty repository to a temporary directory.

# Prepare new Repository 
mkdir /var/tmp/Repo 
cd /var/tmp/Repo 
svn co http://localhost/svn/test --username demo1 
cd test 
mkdir {trunk,branches,tags} 
svn add trunk branches tags 
svn commit -m "" 
rm -rf test

If anonymous read access to the repository has been configured, it's possible to list what we just created without logging in.

svn list http://<SVNServer>/svn/test

Now let's check out what we created:

mkdir Work 
cd Work 
svn co http://<SVNServer>/svn/test
Installation of an IDM run-time environment

For our example installation we need a run-time environment where the IDM, or rather our project, will be deployed. Glassfish is installed into the same image that is already used for the version control. Glassfish is preferred over tomcat, as the administration interface is much more sophisticated.

Installation of a JDK

Today it's very common that a JDK is already bundled with the operating system and installed by default. Unfortunately it is also common that the installed version is a bit out of date. Check that the installed version is at least Java 5 and that the necessary environment variables and search path are correctly set.
With OpenSolaris everything is fine, the included JDK Version is 1.6.0_04 b12.

Installation of Glassfish

Glassfish is available from the OpenSolaris package repository. Installation is very simple:

pkg install glassfishv2

The package neither creates a domain nor configures the service for automatic startup. Luckily, glassfish supports a command for the creation of SMF manifest files. Most of the glassfish commands expect the passwords to be stored in a file, therefore we create a file and change the permissions to read-only for the particular user the application server is using.

Security Warning
For simplicity this example is using the root account for glassfish installation. Don't do that in production environments.

Create the password file /root/.asadminpw
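The asadmin password file uses AS_ADMIN_* entries; a sketch with the default values (change them in production):

```shell
# /root/.asadminpw - passwords for non-interactive asadmin calls
AS_ADMIN_PASSWORD=adminadmin
AS_ADMIN_MASTERPASSWORD=changeit
```

Afterwards restrict the file to its owner, e.g. chmod 400 /root/.asadminpw.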


Now create the domain and the SMF manifest file and enable the application server

# Create the domain
asadmin create-domain --user admin --passwordfile /root/.asadminpw --adminport 4848 --instanceport 8080 domain1

# Create the manifest file
asadmin create-service --passwordfile /root/.asadminpw

# Import the manifest and enable the service
svccfg import /var/svc/manifest/application/SUNWappserver/domain1_var_appserver_domains/Domain-service-smf.xml
svcadm enable svc:/application/SUNWappserver/domain1:default
Installation of MySQL

OpenSolaris does not come with a preinstalled MySQL database. Instead it supplies MySQL and MySQL5 in its package repository. Therefore installation is as simple as the installation of the other components:

pkg install SUNWmysql5

The package comes already with a manifest file. To enable the database start through the boot process do:

svccfg import /var/svc/manifest/application/database/mysql.xml
svcadm enable svc:/application/database/mysql:version_50

Unfortunately the binaries are stored in a separate directory which is not included in the search path. I am a bit lazy and prefer to have the commands right at hand. For adding the binary location to your environment, put the following into your shell profile:

MYSQLBIN=/usr/mysql/5.0/bin        # adjust to your installation
if ! echo ${PATH} | grep -q ${MYSQLBIN} ; then
        PATH=${PATH}:${MYSQLBIN}
        export PATH
fi

As the default character set for MySQL should be utf8, modify the /etc/mysql/my.cnf configuration file and add the necessary options.
Add the following two lines to the [mysqld] section:
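A common pair for forcing UTF-8 server-side in MySQL 5.0 is the following; the exact option names are an assumption, check them against your MySQL version:

```ini
# [mysqld] section of /etc/mysql/my.cnf (assumed values)
character-set-server=utf8
collation-server=utf8_general_ci
```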


As Identity Manager is using InnoDB tables, check that they are enabled. In the default configuration file used above the innodb related lines are commented out by default. Uncomment every configuration line that starts with innodb. A subsequent restart is necessary to re-read the configuration.

svcadm restart svc:/application/database/mysql:version_50

Setup Using Redhat Enterprise Linux 4

Components chosen
  • Redhat Enterprise Linux 4
  • subversion 1.4.6
  • apache 2.0.52
  • Sun Application Server 9.1
  • java 1.5.0 b15
Installation of RHEL4 as a VMware Image

Creation of a VMware image is considered straightforward and won't be discussed here. If you need support, consult the RHEL4 Installation Guide and the VMware manual. For the next step the operating system should already be installed, and a user account and the root account should exist.
Installing the version control system

Please add the following packages:

  • httpd-2.0.52-25.ent.rpm
  • apr-0.9.12-2.i386.rpm
  • apr-devel-0.9.12-2.i386.rpm
  • subversion-1.4.6-1.i386.rpm
  • mod_dav_svn-1.4.6-1.i386.rpm
Configuring the version control system

First of all the apache module mod_dav_svn must be configured.
Create the root of the repository and give it to the account the web server is using.

mkdir /var/www/svn 
chown -R apache.apache /var/www/svn

The authentication of users accessing the repository by means of http(s) is done by the web server. Every authentication mechanism supported by it can be used. Additionally, subversion adds mod_authz_svn as an authorization module. For this example we will just use basic authentication supplied by the web server.

Security Issue
The password is traveling the network in clear text. For production installations use a different authentication method or use at least https to secure the connection to the server.
See this page for a discussion on how to set up apache ssl mode.

The most basic approach is to store the passwords in a password file. This file is written with command line tools coming with the web server. It is also possible to use other repositories to store the passwords. Most modern versions of apache support the use of a directory server out of the box.
For our installation we just use the password file.
Create the password file

htpasswd -c /etc/httpd/subversion.passwd demo1
htpasswd /etc/httpd/subversion.passwd demo2

Depending on the distribution apache is reading its configuration from different files and directories. In case of RHEL4 the web server expects the module specific configuration files to be stored in /etc/httpd/conf.d.

Create the configuration file for the apache subversion module: /etc/httpd/conf.d/subversion.conf

# Needed to do Subversion with Apache server. 
LoadModule dav_svn_module modules/  

# Only needed if you decide to do "per-directory" access control. 
#LoadModule authz_svn_module modules/  

<Location /svn> 
   DAV svn  
#  SVNPath /home/svnroot 
   SVNParentPath /var/www/svn  

#  Limit write permission to list of valid users. 
#      Require SSL connection for password protection. 
#     SSLRequireSSL  
      AuthType Basic 
      AuthName "Subversion Repository" 
      AuthUserFile /etc/httpd/subversion.passwd 
      Require valid-user 
</Location>

Now make the webserver re-read the configuration:

/etc/init.d/httpd restart

To check the installation create a repository on the system and import a file.

# Create a repository 
svnadmin create /var/www/svn/test 
chown -R apache.apache /var/www/svn/test
Never ever write directly to the file hierarchy of the repository;
instead always use the subversion commands, either from the command line or via GUI tools.

Now the sub directories as proposed by the subversion book have to be created. To do so check out your new, empty repository to a temporary directory.

# Prepare new Repository 
mkdir /var/tmp/Repo 
cd /var/tmp/Repo 
svn co http://localhost/svn/test --username demo1 
cd test 
mkdir {trunk,branches,tags} 
svn add trunk branches tags 
svn commit -m "" 
rm -rf test

If anonymous read access to the repository has been configured, it's possible to list what we just created without logging in.

svn list http://<SVNServer>/svn/test

Now let's check out what we created:

mkdir Work 
cd Work 
svn co http://<SVNServer>/svn/test
Installation of an IDM run-time environment

For our example installation we need a run-time environment where the IDM, or rather our project, will be deployed. The Sun Java System Application Server is installed into the same image that is already used for the version control. The Sun Java System Application Server is preferred over tomcat, as the administration interface is much more sophisticated.

Installation of a JDK

Today it's very common that a JDK is already bundled with the operating system and installed by default. Unfortunately it is also common that the installed version is a bit out of date. Check that the installed version is at least Java 5 and that the necessary environment variables and search path are correctly set. In our case with RHEL4 a JDK version 1.5.0 Update 15 is installed, which is sufficient. Unfortunately the environment is not set up correctly. Different distributions have different approaches to configuring the environment; please check the documentation. For RHEL4 the following script, stored in /etc/profile.d/, can be used to set up the environment.

JDK=/usr/java/jdk1.5.0_15/bin      # adjust to your JDK installation
if ! echo ${PATH} | grep -q ${JDK} ; then
        PATH=${PATH}:${JDK}
fi
JAVA_HOME=/usr/java/jdk1.5.0_15
export JAVA_HOME
Installation of the Sun Java System Application Server

Grab the software from the Sun Download pages and install it. By default the bits and pieces are installed into /opt/SUNWappserver.

Default credentials
user: admin, password: adminadmin

For installing the software simply fire up the self-extracting binary:

sh ./sjsas-9_1_02-linux.bin
For installation only, the Sun Application Server needs a basic X-Windows setup; this is not true for glassfish.

Again if you prefer to have the administration binaries in your search path, create a setup script /etc/profile.d/SUNWappserver:

APP=/opt/SUNWappserver/bin         # adjust to your installation
if ! echo ${PATH} | grep -q ${APP} ; then
        PATH=${PATH}:${APP}
        export PATH
fi

If you want the application server to be started automatically at boot time, another startup script for /etc/init.d and friends is needed. As the application server needs a password for startup, a password file has to be created. Make sure it is read-only. I store it at /opt/SUNWappserver/password.txt
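A sketch of such a password file, using the default admin password from above:

```shell
# /opt/SUNWappserver/password.txt - admin password for asadmin
AS_ADMIN_PASSWORD=adminadmin
```

Make it read-only for root, e.g. chmod 400 /opt/SUNWappserver/password.txt.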


Now the startup script /etc/init.d/sunwappserver. The variable definitions and the case labels below are a sketch and should be adapted to your installation:

#!/bin/bash
# "$Id:,v 1.00 2008/05/06 05:19:16 achimr Exp $" 
#   Startup/shutdown script for the Sun Application Server 
#   Linux chkconfig stuff: 
#   chkconfig: 2345 55 10 
#   description: Startup/shutdown script for the Sun\ 
#                Application Server 9.x 
#   Copyright 1997-2008 by Sun Microsystems, all rights reserved. 

# Source function library. 
if [ -f /etc/init.d/functions ] ; then 
        . /etc/init.d/functions 
elif [ -f /etc/rc.d/init.d/functions ] ; then 
        . /etc/rc.d/init.d/functions 
else 
        exit 0 
fi 

# Adjust these paths to your installation
prog=sunwappserver
DAEMON=/opt/SUNWappserver/bin/asadmin
PASSWD=/opt/SUNWappserver/password.txt
RETVAL=0

start () { 
        echo -n $"Starting $prog: " 
        # start daemon 
        $DAEMON start-domain --user admin --passwordfile $PASSWD domain1 
        RETVAL=$?
        [ $RETVAL = 0 ] && touch /var/lock/subsys/sunwappserver 
        return $RETVAL 
} 

stop () { 
        # stop daemon 
        echo -n $"Stopping $prog: " 
        $DAEMON stop-domain domain1 
        RETVAL=$?
        [ $RETVAL = 0 ] && rm -f /var/lock/subsys/sunwappserver 
        return $RETVAL 
} 

restart() { 
        stop 
        start 
} 

case $1 in 
        start) 
                start 
                ;; 
        stop) 
                stop 
                ;; 
        restart) 
                restart 
                ;; 
        condrestart) 
                [ -f /var/lock/subsys/sunwappserver ] && restart || : 
                ;; 
        status) 
                status $DAEMON 
                ;; 
        *) 
                echo $"Usage: $prog {start|stop|restart|condrestart|status}" 
                exit 1 
                ;; 
esac 
exit $RETVAL 

Setup the run level links

sudo /sbin/chkconfig --add sunwappserver

Now we are ready to start the new application server with /etc/init.d/sunwappserver start. For a first check try to access the admin interface of the application server.

Sun Application Server default ports
Administration: 4848
Applications: 8080
Some Linux distributions favor a host-based firewall. Besides ssh, they block every incoming connection by default. As this is also the case with RHEL4, the firewall has to be configured to allow access to ports 4848 and 8080. See the documentation and the network configuration tools for this purpose.
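On RHEL4 the host firewall rules live in /etc/sysconfig/iptables. A sketch of the two additional rules, using the chain name of the stock RHEL4 firewall (verify against your actual rule set before applying):

```
-A RH-Firewall-1-INPUT -m state --state NEW -p tcp --dport 4848 -j ACCEPT
-A RH-Firewall-1-INPUT -m state --state NEW -p tcp --dport 8080 -j ACCEPT
```

Afterwards, re-read the rules with service iptables restart.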
Installation of MySQL

By default MySQL 4 is installed on RHEL4. As there are a lot of issues with MySQL 4.x and Identity Manager, the installation is updated to MySQL 5. MySQL can be downloaded from the MySQL web site.

# Deinstalling MySQL4
sudo rpm -e mysql-4.1.20-1.RHEL4.1.i386 \
            cyrus-sasl-sql-2.1.19-5.EL4.i386 \ 
            perl-DBD-MySQL-2.9004-3.1.i386 \  
            mysql-devel-4.1.20-1.RHEL4.1.i386 \  
            libdbi-dbd-mysql-0.6.5-10.RHEL4.1 \   
            mysqlclient10-3.23.58-4.RHEL4.1 \ 
            MySQL-python-1.0.0-1.RHEL4.1
# Installing MySQL5 (client and the matching server package)
sudo rpm -ivh MySQL-client-community-5.1.26-0.rhel4.i386.rpm \ 
              MySQL-server-community-5.1.26-0.rhel4.i386.rpm

As the default character set for MySQL should be utf8, create a configuration file and add the necessary options. Note that with most distributions a configuration file is not created, as suitable defaults are compiled into the binaries. This is fine as long as you are happy with the defaults.

cd /usr/share/mysql 
cp my-small.cnf /etc/my.cnf

Add the following two lines to the [mysqld] section of the newly created file /etc/my.cnf.
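The two lines themselves did not survive in this copy of the page. A commonly used utf8 setup for the [mysqld] section of MySQL 5.x looks like the fragment below; the option names vary between MySQL versions, so verify them against the documentation of your server release:

```
character-set-server=utf8
collation-server=utf8_general_ci
```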


As Identity Manager uses InnoDB tables, check that they are enabled. In the default configuration file used above, the InnoDB-related lines are commented out by default. Uncomment every configuration line that starts with innodb. A subsequent restart is necessary to re-read the configuration.
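Uncommenting can be done with a one-line sed. The sketch below runs against a scratch copy seeded with a few of the commented defaults from my-small.cnf; on the real system, point CNF at /etc/my.cnf instead:

```shell
# Sketch: uncomment every configuration line that starts with innodb.
# Shown on a local scratch copy; set CNF=/etc/my.cnf on the server.
CNF=./my.cnf
cat > "$CNF" <<'EOF'
#innodb_data_home_dir = /var/lib/mysql/
#innodb_data_file_path = ibdata1:10M:autoextend
#innodb_buffer_pool_size = 16M
EOF
sed -i 's/^#innodb/innodb/' "$CNF"
```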

/etc/init.d/mysql restart
  1. Dec 27, 2008

    Thanks a lot for this article, it is very interesting! And thanks to promote OpenSolaris

  2. Nov 09, 2009

    This is an excellent source of information for Identity Manager 8.1 in addition to how to setup an IDE to work with Identity Manager 8.1. I learned a lot more about the product while doing the IDE setup and I think its an excellent read for someone who wants to establish a solid platform background on the product and understand elements of the architecture to make sense of more intermediate things down the road. Kudos to everyone who participated in this.

  3. Mar 19, 2010

    This is a very good article. Thanks a Ton!

  4. Apr 28, 2010

    Excellent article, thank you!

    One minor thing I noticed, though, that made me cringe - Java class file names should always start with UPPER case letters, just like Camel Case does.
    CodeConv - Naming Conventions

  5. Nov 03, 2011

    update to Development Tools
    IDE above

    See - for Oracle Waveset/Sun IDM nbm (netbeans) plugins

