Thursday, 20 December 2012
Webinar January 16 2013
Friday, 9 November 2012
Creative thinking events
I met with one of the participants from our summer event, and she described EA as a very exciting and useful tool. She has used Archi to create many maps to help with planning and future-proofing of plans. She had had 'coaching' from Sam and Fleur to help her develop her skills, and she found this the most useful development activity. It's likely that she will present a short case study about her experience at one of our future events.
Thursday, 1 November 2012
Impact of workshop
Tuesday, 9 October 2012
Creative Thinking
We met with our project partner, Stephen Powell from Bolton, on 18th September to share our plans for the workshop on 6 November with the RSC, and had some useful feedback.
We have planned a second workshop for April 2013 (probably the 9th or 16th) in Newcastle with Andrew Stewart.
Ray and Ian have been working to develop the workshop materials, activities and PowerPoint slides. Ian and I have created some 'cards' based on the Archi icons to use in the workshop for the initial activity. We hope that by engaging participants in a hands-on activity to create a simple process map with cards, they will be able to conceptualise the EA process and transfer easily to the Archi software.
We are planning to deliver a webinar with Stephen Powell sharing our case studies in January. This may be delivered as part of a face-to-face workshop.
I have arranged to meet with one of the participants from our first workshop in July to see how they have been implementing EA.
Thursday, 20 September 2012
JISC Meeting
Tuesday, 18 September 2012
Partner meeting
Wednesday, 11 July 2012
ArchiMate - training and spreading the word
There was a lot for the trainer to fit into the two days (in reality just over a day and a half, thanks to fitting the exam in at the end of day two). I found it useful in giving me confidence in what I have been doing, and it helped me understand the relationships involved in ArchiMate, which have been my biggest barrier (I have been relying heavily on the magic connector in Archi!). It also helped my thinking, alongside Sam's blogs on here! Feedback from the training showed that people wanted to do more examples; although the training included a number of examples throughout the two days, it was clear that participants wanted more.
My colleague and I had a lot of discussion about how we saw the official training, and how it fit with our own intention to deliver some training through the Benefits Realisation funding from JISC. We realised a couple of things for our (much shorter) events:
- Value of building up a model needs to be clear at the start, along with the acknowledgement of the amount of work this can take!
- Working on 'paper' for the first examples is a good idea before introducing a tool like Archi. But 'paper' on its own can cause problems, using 'signed' post-it notes to allow you to move elements around would be useful for discussion purposes.
- There is a quick acceptance that paper will only take you so far before you need a tool like Archi to develop more complex models
- Examples for us need to be relevant to HE, perhaps around Assessment, Course Information and Course Development (three popular areas within Staffordshire University)
- Group work brings out the best discussions as long as they are with individuals with similar experience (having someone in the group with more experience can cause the team to split)
- Group work needs to split people from the same institution into different teams (unless they make up one team, otherwise again the team can split)
Monday, 9 July 2012
Comparing ArchiMate Views With Process Maps
Throughout the process map there are references to actors, roles, business objects and other entities that we would want to include in the ArchiMate model. These are highlighted in the text of the process map below.
The highlighted items refer to ArchiMate elements as shown in the process map below. The overall process becomes a Business Process element in the ArchiMate view. The swimlanes refer to ArchiMate Roles. The text describes several Business Objects, software applications and bits of infrastructure like shared network drives (represented as Technology Nodes).
These ArchiMate elements can be collected into an ArchiMate view, and relationships and inferred elements added, to arrive at the view shown below.
Thursday, 5 July 2012
Process Automation and Continuous Integration in the JISC Enable Project
Agile Project Management
- Product owner - who identifies and prioritises features to be developed
- Development team - who develop features according to self-organised plans
- Scrum master - who shields the development team from distractions and removes impediments.
The Process Of Delivering A Software Feature
ArchiMate 'as is' view of feature delivery |
Develop Feature (Tests and Code)
Create Code and Tests Using Integrated Development Environments (IDEs)
- Intelligent coding assistance
- Code generation
- Easy navigation and search of the codebase
- Automated refactoring
- Static analysis
- Support for languages other than Java
- Support for technologies and frameworks we use
Automated Builds With Maven
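The build file itself isn't shown in the post; for context, a minimal Maven POM for a web application of this shape might look like the sketch below. The group/artifact coordinates and version numbers are invented for illustration, not the project's real ones.

```xml
<!-- Hypothetical coordinates: not the Enable project's actual POM -->
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>uk.ac.example.ldi</groupId>
  <artifactId>external-examiners</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>war</packaging>
  <dependencies>
    <!-- Test-scoped JUnit so 'mvn test' runs the automated tests -->
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.8.2</version>
      <scope>test</scope>
    </dependency>
  </dependencies>
</project>
```

With a POM like this, `mvn package` compiles the code, runs the tests and produces the deployable WAR.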
Automated Tests With JUnit
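As a flavour of what an automated unit test checks, here is a deliberately tiny sketch. The class and method are invented for illustration (they are not from the Enable codebase), and plain assert statements stand in for the JUnit API so the snippet compiles on its own; a real JUnit test would put the same checks in an @Test-annotated method.

```java
// Hypothetical example, not from the Enable codebase. JUnit tests
// follow the same arrange-act-assert shape; plain asserts are used
// here so the snippet stands alone without the JUnit jar.
public class TenureCheck {

    /** A tenure is valid if it starts before it ends. */
    public static boolean isValidTenure(int startYear, int endYear) {
        return startYear < endYear;
    }

    public static void main(String[] args) {
        // Arrange-act-assert, as a JUnit @Test method would do
        assert isValidTenure(2010, 2014);
        assert !isValidTenure(2014, 2010);
        System.out.println("all checks passed");
    }
}
```

Tests of this shape run on every build, which is what makes the later continuous integration steps worthwhile.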
Source code management
IntelliJ IDEA version control menu |
Test (Local Commit)
Inspect/Manually Test
Commit
Revisions and commit messages - always include a message (unlike some of the examples above) |
Test (Commit Stage)
Continuous Integration with Jenkins
Jenkins dashboard showing build jobs |
Creating a build job from within NetBeans |
Code Quality Analysis
SONAR dashboard gives an overview of quality metrics for a project |
SONAR Time Machine showing trends in quality metrics |
Deploy Artifact
Artifactory options in a Jenkins build job |
Artifactory stores the WAR from every build |
Deploy Application To Staging
GlassFish
MyBatis Migrations
Test (Acceptance)
The tests are created using the Selenium IDE which records browser actions as the user interacts with the application.
Selenium IDE showing recorded interactions |
Using the Selenium IDE, use cases or user stories can be enacted and recorded. These can be saved as code to be run as automated acceptance tests.
Saving interactions as a JUnit test |
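For illustration, an exported test of that era (Selenium RC style) looks roughly like the sketch below. The base URL, element locators and class name are invented, not taken from the project's actual recorded tests, and the code needs the Selenium client jar and a running Selenium server, so treat it as a sketch rather than a self-contained program.

```java
import com.thoughtworks.selenium.SeleneseTestCase;

// Illustrative only: URL, locators and test name are assumptions.
public class SearchExaminerTest extends SeleneseTestCase {

    public void setUp() throws Exception {
        // Browser and application URL under test (assumed values)
        setUp("http://staging.example.ac.uk/", "*firefox");
    }

    public void testSearchExaminer() throws Exception {
        selenium.open("/examiners/search");          // recorded navigation
        selenium.type("id=surname", "Smith");        // recorded keystrokes
        selenium.click("id=searchButton");           // recorded click
        selenium.waitForPageToLoad("30000");
        verifyTrue(selenium.isTextPresent("Smith")); // recorded verification
    }
}
```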
With our current setup, the developer runs the automated acceptance tests from their PC. Because we are testing Web applications via the browser we can test from anywhere. If the acceptance tests pass, the application is ready for deployment to the production server.
Deploy Application To Production
Next Steps
We plan to remove manual steps through automation with Gradle builds |
Wednesday, 4 July 2012
Archi Training
Friday, 22 June 2012
Creative Thinking Events
Friday, 8 June 2012
Birth, Death and Resurrection of Senior Management Engagement
At the start of the project a new Executive Pro Vice Chancellor (PVC) for Learning and Teaching had been appointed, who was the initial sponsor of the Enable project. The main role of the Executive PVC was to chair the Senior Management Working Group, consisting of a number of senior faculty staff (Deans and/or Faculty Directors for Learning and Teaching) and a number of Directors/Heads of Services and senior colleagues.
In addition to the start of a new PVC, the then Vice Chancellor had indicated that she would soon be retiring but had not yet fixed a date; this was subsequently confirmed as January 2011. As a consequence, the academic years 2008/9 and 2009/10 were characterised by a certain amount of “planning blight”, and senior managers being (understandably) cautious in the face of impending change.
Not only did the executive start the project in some state of organisational churn, so did the department the Enable team were working from. The Learning Development & Innovation (LDI) team had recently been moved (following an external review) from the University’s Information Service to the Academic Development Institute (led by the Director of Academic Development).
After the first four months of the project the Academic Development Institute was abolished and the LDI team (including all Enable project staff) became a standalone team reporting to the Executive PVC. During this period, senior management engagement with the project was good. There was also considerable engagement from staff involved in the various change initiatives across the University, and from award leaders, programme managers, Faculty business staff and quality administrators.
About 18 months into the project, the Executive PVC left the University (and was not replaced for about a year). Following a fairly lengthy hiatus during which it was unclear (even to the Head of LDI) who the LDI team reported to, it was agreed that the team and the Enable project should report to the Deputy Vice Chancellor through the Director of Academic Policy and Development. Senior management engagement had waned somewhat during the “hiatus” (though “spoke” engagement had remained good); however, the Deputy Vice Chancellor became very receptive to the ideas on managing change and sustaining innovation being promoted by Enable, and a good period of senior management engagement ensued. This period also coincided with the “last days” of the previous Vice Chancellor and the selection and arrival of the new Vice Chancellor, who took up leadership of the University in January 2011. As a result, although engagement with Enable’s ideals was good, translating that engagement into action was very difficult. However, this period also saw the opportunity – seized by the Enable team – to initiate the “FLAG” work of Enable on the back of a number of Senior Leadership Team initiatives instigated by the new Vice Chancellor.
In June 2011, a new Executive PVC arrived at the University. At this point, oversight of the LDI team moved into the new PVC’s purview although the Head of LDI continued to report to the Director of Academic Policy and Development who had similarly moved reporting lines. This move created another “disjoint” in senior management engagement as the new PVC obviously had a great many things to take on board and to plan.
By the end of the project, 7 of the 17 people who attended the first SMWG meeting had left the University, including the Executive PVC. Nevertheless engagement was subsequently renewed, and the Executive team has now picked up messages sent by Enable, including the concept of ‘joined up thinking’ and the development of a Change Management role. This renewed interest was due to the project team being able to present a clear message about Enable to the executive, thanks to previous experience of communicating with the executive team; alongside this, the project team were able to use senior management champions to pass the message of Enable on to the executive.
Despite the considerable “organisational churn” evidenced above, a constant and stable factor throughout has been a recognised institutional need to ensure curriculum development is responsive to demand. This included ensuring that policies, processes and supporting technologies for curriculum/product development were designed to be responsive to the needs of both faculties and learners. This required flexible management of the existing portfolio, including the process for creating new products, along with guidelines and workflows to encourage a culture of innovation.
A History of FLAG
Background
FLAG was first raised in the Flying Forward blog post (May 2011), which highlighted the reasons why a tool was needed to support course developments focusing on flexible learning, and in consequence all course developments. FLAG (Flexible Learning Advice and Guidance) is a support tool designed to address a number of issues highlighted by Enable. To reiterate the issues here:
- Difficulty in finding the right advice on course design at the right point
- Knowing which source of information would be the best/ most up to date
- Identification of champions to support stakeholders engaged in course design
- Reduction in faculties having to produce their own advice and guidance
- Takes the burden off staff of holding expert knowledge of the whole process
Approach
As previously mentioned in the May 2011 post, the project team treated FLAG development as an internal project, with a full project plan, clear roles and responsibilities and a list of relevant stakeholders. In September a new blog post, New Product Design, was published about the approach of FLAG. That post discusses the issues highlighted by engaging stakeholders across the University, with a clear focus on the process of course development, using the baseline information from Enable. This focus with stakeholders helped the project team unpick issues not previously noted by Enable, or reinforced issues noted during the baselining process. The project team spoke to course developers using examples of the ArchiMate models from the baseline that focused on the different stages of course development. For initial interviews with faculty staff, the model was printed out and then drawn on to update it to reflect what was taking place internally. It is worth noting that the initial models focused on University-level processes; by discussing these with the faculties we were able to capture each faculty's unique processes.
Each updated model was then used to create a best practice workflow, broken down into the three stages of course development: Strategic Approval, Planning and Validation (similar to the stages in the Manchester Metropolitan University Accreditation! game; for a screenshot of the game, check out the CETIS blog). These stages helped break down the workflow, yet even with them each workflow filled one side of A3 paper! The workflows were then taken around for a second round of interviews, and updates, changes and other aspects of course design were added; for example, how the faculties engaged with both Partnerships and Quality needed further modification. This round of interviews also helped capture the supporting documents used by staff at different points in the flow, and where they needed links to useful documents.
After the second round of interviews, the project team input the master workflow into the Pineapple system. This helped the team sharpen the workflow and the links to supporting documentation. Once the workflow had been completed, a draft handbook was written to support the use of the software, and both were given to staff within the Learning Development and Innovation team for testing. Successful completion of the tests resulted in the project team promoting FLAG as 'in pilot' with faculties and partners, and volunteers from each were recruited.
At the start of the pilot the volunteers were asked to complete a short online questionnaire about how they managed course developments and whether they felt they focused on traditional course design. At this point the pilot was blogged again. However, since the launch of the pilot a number of changes have occurred in the University, causing engagement to decrease: the first was the restructure of faculties and schools from six to four; the second was the change in credit structures for modules; and finally, the process itself started to go through some change. The changes to the process had a limited impact on the pilot, as they have yet to be approved by the committees. In the long term these changes will benefit the project: by putting them straight into FLAG we can ensure that course design follows the latest process, with the most up-to-date support documentation.
Due to these changes, and the length of time it takes to go through course development, the project team have left the piloting teams to work with FLAG at their own pace, with emails every three weeks to check that staff are still happy using the tool. Unfortunately, the project team have recently been informed that one course design team has stopped using the tool, and we are organising a meeting to find out what stopped their engagement. The project team are also organising interviews with other staff engaged in the pilot, and developing an exit questionnaire to find out whether their approach has been improved by the use of the tool, or whether it helped them think outside of the traditional course development box.
Information about FLAG, its models and workflows has been handed over to two new initiatives in the University: the first is the Student Records System, which would store information on courses post-validation, and the other is the JISC-funded XCRI-CAP project. The project team also intend to work with the Document Management initiative to discuss opportunities to further develop the tool within that environment.
Lessons Learnt
By using FLAG as a way of starting conversations about course design within faculties, it became clear that the 'uniqueness' of each faculty was more a perception within that faculty than the reality. This is important to capture to ensure continuing stakeholder engagement – and can help stakeholders realise similarities in behaviour. Start by interviewing senior staff engaged in curriculum design before interviewing those 'at the coal face'; this can highlight the difference between perceived processes and what actually occurs.
It is useful to interview stakeholders in small groups, for example tutors from the same faculty, business and quality administrators from the same faculty, and service teams (partnerships and quality teams are important), before getting a mix of groups together to discuss the models and workflows.
As highlighted in the Flying the Flag blog post, be prepared for the pilot to take some time. Initial engagement with the pilot was high; however, as the course developments continued some pilot teams became disengaged from using the software. This could be expected, depending on when course teams feel they need the most support. Continued engagement with the course development teams is required at this stage.
Process ownership can be difficult, especially over a large process such as course design, and is often easy to ignore in a project. It is important to get buy-in from those involved in managing the process so that they can take ownership of updating the tool when the processes change. It is easy to think in terms of a single process owner, but consider a process ownership team for larger processes.
Make sure you are clear about what the purpose of the project is and what its scope is. Although this was a project within Enable, using a project plan really helped communicate the scope and purpose of the project, and how, if the project was a success, the tool would be handed over to the process owners for further development and embedding, not left with the Enable team.
Friday, 18 May 2012
Java Development in the JISC Enable Project
Readers who aren't interested in technical details may want to duck out at this point. It's also quite a long post!
The Business Problem
To recap the business problem, the University had identified significant duplication in quality-related business processes. We had gained approval to develop an application to address duplication in external examiner approval and reporting processes. The application would provide:
- an interface for creating and updating information to track the approval and reporting processes
- document management capability for sharing the documents and forms used in the processes
- reporting capability to provide on-demand reports for sharing with stakeholders.
The Java EE 6 Platform
We chose to build the external examiner application on the Java Platform Enterprise Edition Version 6 (Java EE 6). Java EE 6 comprises a set of Application Programming Interfaces (APIs) to make development of multi-tiered and distributed enterprise applications easier, simpler and faster. We chose this platform for the following reasons:
We needed to do more with less
In recent years, we had been unable to replace staff who had left the team. Demands on the technical team had remained high, with new opportunities for innovation needing to be grasped as they appeared. Consequently, it had been a case of needing to do more with less. We are always on the look-out for principles, practices and technologies to maximise the efficiency and effectiveness of the team. Java EE 6 had the promise of achieving more with less (and cleaner) code.
We wanted the benefits of the Java EE 6 APIs
Prior to Enable, we had used Apache Tomcat. Tomcat implements the Java Servlet and JavaServer Pages (JSP) technologies. Applications we created to run on Tomcat were based on JSPs, servlets and portlets interacting with a database via the Java Database Connectivity (JDBC) API. This involved writing a significant amount of code to manage cross-cutting aspects of our applications like security, transactions and persistence. Using an application server instead of Tomcat, we could use the Java EE 6 APIs and services provided by the application server to avoid much of the boilerplate code we would previously have written to manage these cross-cutting aspects. An application server implements the full Java EE Platform, so it provides JSP and Servlet implementations and a host of other APIs including:
- Enterprise JavaBeans (EJB)
- Java Persistence API (JPA)
- JavaServer Faces (JSF)
- Java Message Service (JMS)
- Contexts & Dependency Injection (CDI)
- Java Transaction API (JTA)
- JavaMail
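To illustrate the kind of boilerplate these APIs remove, the sketch below shows a container-managed session bean. The service class and method are invented for illustration (only Examiner echoes the domain model described later in the post); with the @Stateless and @PersistenceContext annotations, the application server supplies the transaction handling and persistence wiring that we previously wrote by hand against JDBC.

```java
import javax.ejb.Stateless;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;

// Hypothetical service class, not from the Enable codebase. The
// container injects the EntityManager and wraps each business method
// in a transaction, so no JDBC or transaction-handling code appears.
@Stateless
public class ExaminerService {

    @PersistenceContext
    private EntityManager em;

    public Examiner find(Long id) {
        return em.find(Examiner.class, id);
    }
}
```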
GlassFish Administration Console |
We chose to use the GlassFish open source application server because it was the reference implementation for Java EE 6 and the only application server that supported Java EE 6 at the time.
Java EE 6 promised to be simpler than Spring
An alternative to Java EE would have been to use the Spring Framework, but we decided against it. Spring was created as a simpler alternative to the overly-complex and invasive programming model of Java 2 Platform Enterprise Edition (J2EE), Java EE's predecessor. It emphasised simplicity of application design through use of dependency injection and aspect-oriented programming (AOP). Spring gained widespread adoption and became for many the obvious choice for enterprise Java development. We had some experience of Spring and liked the dependency injection and AOP elements, but not the use of XML for declarative configuration. Also, the Spring Framework had grown so much over the years that reacquainting ourselves with its large feature set was going to be a non-trivial exercise.
Java EE 6 uses a simplified programming model through use of a convention over configuration approach. With dependency injection, separation of concerns and persistence baked into the platform, Java EE 6-based applications promised to be as lean and mean as equivalent Spring applications, if not more so. Aiming for reduction in complexity, we opted for Java EE 6 instead of Spring.
Architecture of the external examiner system
An ArchiMate view of the applications used in the external examiner system is shown below.
Layered ArchiMate view of the external examiner application |
GlassFish
Our application server was the GlassFish Server Open Source Edition, which is free but follows the usual 'community' model of support, i.e. you solve your own problems with information gleaned from forums, blogs, bug tracking systems, etc. Initially, we tried to run all three applications (External Examiners Web Application, Alfresco, JasperServer) on a single GlassFish instance, but the memory requirements of the combined applications in production made it impossible to run all three together. Also, Alfresco is designed to run on Tomcat and, although our attempts to get it running on GlassFish were initially successful, each Alfresco release brought new configuration problems, so we decided to run a 'vanilla' (default configuration, bundled) Alfresco instance on Tomcat on a separate server to avoid unnecessary configuration work.
MySQL
We chose MySQL as our database software because:
- we had experience of it from previous developments
- it is mature, robust and fast
- it has a large community of users and comprehensive documentation
- it has good free tooling available.
Alfresco
We used Alfresco to provide document management services for the external examiners application, as an interim solution to fill the document management capability gap until a University-wide document management solution is implemented. The University has a Document Management steering group which has identified the need for an institutional enterprise document/content/records management system and gathered requirements for it. Work is progressing to prepare the business case and to procure and implement a system. In the absence of a University system, we used Alfresco Community - a free version of the Alfresco open source Enterprise Content Management system. This is another community-supported offering, intended for non-critical environments. Alfresco was chosen to:
- provide shared document management functionality for the application
- be similar enough to a University-selected solution to make re-implementation using the University solution easy
- illustrate the value of document management to gain further grass roots support for the document management proposal
- get some experience interacting with a document management solution to inform the University implementation.
Uploading a report to the external examiners document library |
JasperReports Server
JasperReports Server was chosen to provide shared reporting functionality, replacing generation of reports directly from the Microsoft Access database and circulation of them by email. JasperReports Server hosts reports created using the iReport designer tool, and allows stakeholders to run and download reports on demand. We used the JasperReports Server Community Edition, which is free and has the usual community-supported approach.
A report run on the server |
iReport
iReport is a free, open source report designer for JasperReports. We used it to create reports to replace locally generated reports from the Microsoft Access database. We used TOAD for MySQL to visually design SQL queries to return data from the external examiner database for each report, and used JDBC datasources in the reports so that the reports hosted on JasperReports Server dynamically query the database each time a report is run.
Visual design of a SQL query using TOAD for MySQL |
The report is designed using fields from the database.
Designing a report using iReport |
Editing the XML source of the report |
List of reports hosted on the server |
External Examiner Application
The external examiner application has been developed by the Learning Development and Innovation (LDI) department technical team. It provides an interface for creating and managing information associated with the external examiner appointment and reporting processes, and a data import tool to transfer legacy data to the new database.
The application is managed as three separate projects to simplify development:
- domain model
- legacy data import application
- web application
Domain Model project
The domain model is a separate project to allow it to be used by the data import application and the web application.
'Persistence Plumbing'
The domain model project models the 'things' in the real world that we are interested in and that we want to store and share information about. These are objects like external examiners, tenures, courses, reports, etc. It also includes the object relational mapping (ORM) metadata - the 'persistence plumbing' - which allows these entities to be loaded from and saved to the database. We use the Java Persistence API (JPA) to do this, with the Hibernate Java persistence framework providing the API implementation.
XCRI influence
Information about courses features heavily in the information recorded about external examiners and their tenures. We based the course information in our domain model on the XCRI CAP 1.1 information model. A class diagram of the domain model is shown below (open it in a new tab or window and click it to zoom into the detail).
External examiner application domain model classes |
Because we were learning lots of new technologies concurrently, we wanted to keep each aspect as simple as possible. Inexperience with Hibernate made us conservative about how to implement the domain model mappings. We chose to avoid inheritance to keep the Hibernate mappings simple, which meant that the domain model was a bit more complicated: we replaced inheritance in the XCRI model with composition.
XCRI Course inherits GenericDType, our Course composes GenericDType |
The downside of this was that any changes to the methods of objects being composed required corresponding changes to the methods in the objects doing the composing. Happily, changes to the XCRI objects in the domain model were relatively rare. If we started again today, with our Hibernate experience, we would just include the XCRI information model 'as is', with inheritance and all.
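The composition approach can be sketched in plain Java as follows. GenericDType and Course mirror the XCRI-derived names in the domain model, but the fields and method bodies are invented for illustration; the hand-written delegation methods are exactly the maintenance cost the post describes.

```java
// Sketch of composition replacing inheritance. Names echo the XCRI
// model; the identifier field is an invented example property.
class GenericDType {
    private String identifier;
    public String getIdentifier() { return identifier; }
    public void setIdentifier(String identifier) { this.identifier = identifier; }
}

// Instead of "class Course extends GenericDType", Course holds a
// GenericDType and delegates to it. Every delegated method must be
// written out by hand, so a change to GenericDType's methods forces
// a matching change here.
class Course {
    private final GenericDType dtype = new GenericDType();
    public String getIdentifier() { return dtype.getIdentifier(); }
    public void setIdentifier(String id) { dtype.setIdentifier(id); }
}

public class CompositionDemo {
    public static void main(String[] args) {
        Course course = new Course();
        course.setIdentifier("AW-1234");
        System.out.println(course.getIdentifier()); // prints AW-1234
    }
}
```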
Mapping Metadata
The domain model contains metadata which maps object fields to tables and columns in the database. The mapping for the collection of Tenures associated with an Examiner is illustrated below. In this example, an annotation is added to the getTenures() method of the Examiner class to specify the table and columns that will be used to store the collection of tenures in the database. The Hibernate Java persistence framework can use this metadata to create the database structure when the application is first run. The Tenures collection is represented in the database as the examiner_tenures table, the structure of which is shown in the screenshot.
Persistence mapping of examiner tenures to a table in the database |
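The original post shows this mapping only as a screenshot. In code, a mapping of the shape described (an annotation on getTenures() naming the examiner_tenures table) would look something like the fragment below; the cascade setting and column names are assumptions for illustration, not the project's actual metadata.

```java
// Sketch only: cascade option and column names are guesses; the
// examiner_tenures table name comes from the post.
@OneToMany(cascade = CascadeType.ALL)
@JoinTable(name = "examiner_tenures",
           joinColumns = @JoinColumn(name = "examiner_id"),
           inverseJoinColumns = @JoinColumn(name = "tenure_id"))
public List<Tenure> getTenures() {
    return tenures;
}
```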
Integration tests
We have created integration tests to check the persistence mappings. These tests use DbUnit, an extension to the JUnit unit-testing framework. DbUnit is used to set the database to a known state before each test is run. The tests check that the database is in the expected state when a known object is saved, and that the expected object is returned when loaded from a known database state. We use an in-memory HyperSQL database for these integration tests because the tests run faster and no clean-up is required - after the tests have run, the in-memory database ceases to exist. The tests are run automatically on each build of the domain model project.
Data Import project
The data import application loads data from the legacy Microsoft Access database used by the central quality team and persists it to the shared MySQL database. This application is run once only, to import the legacy data before the external examiner application is first used.
The main import method of the data import project |
- the name of the examiner who is taking over reporting duties for this award
- an ending date for the award
- the reason that the award is ending.
Web Application project
The External Examiners Web Application provides a user interface for managing information to support the external examiner appointment and reporting processes. A new examiner record can be created, or an existing examiner record can be located via the Search screen.
Search results |
Clicking on one of the search results takes the user to the Edit screen where information can be entered and updated. On this screen, examiner contact details and tenure information can be recorded. Appointment records can be uploaded to Alfresco via the upload button. Uploaded documents are automatically placed into the correct faculty area. When reports arrive, they can be uploaded to Alfresco in the same manner on the reports tab.
Edit examiner screen showing an examiner's tenures |
Technologies Used in the Web Application
PrimeFaces
We used the PrimeFaces JSF component suite for on-screen components because it is easy to use and complements JSF by providing more sophisticated components than the default JSF suite. This makes development faster by allowing us to focus on building a user interface from existing components rather than having to design and build custom components. For example, PrimeFaces has a file upload component that we use to upload documents to Alfresco.
Seam
As we created the external examiners web application and gained experience in Java EE 6 development, we came to realise that Java EE 6 does not quite live up to its promise. Some aspects, like declarative error handling via the deployment descriptor, simply do not work, and other aspects, like dependency injection, always seem to stop short of providing enough flexibility to suit the circumstances of your application. To overcome these issues, we turned to the JBoss Seam Framework to fill in the missing pieces.

Seam complements Java EE 6 well because it is based on the Java EE platform and many of its innovations have been contributed back into the Java EE platform. CDI was a Seam idea and the reference implementation of it is included in the Java EE distribution. Seam can be thought of as anticipating the next Java EE, and it provides a host of features that you wish had been included in the reference implementation.
The Seam features most important to us were:
- injection of objects into JSF converter classes (via the Faces module).
- easy creation of exception handlers to handle application errors and session expiry (via the Solder module). The orthodox Java EE way to do this, via declarations in the web application deployment descriptor, did not work because the application server wrapped all exceptions in an EJBException making handling of individual error types impossible. Solder unwraps the exception stack and handles the root cause by default, allowing easy creation of methods to handle individual error types and conditions.
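The unwrapping behaviour described above can be sketched conceptually in plain Java. This is not Solder's actual API, just an illustration of what it does for us: walk the wrapped exception chain (everything the application server buried inside an `EJBException`) down to the root cause, then dispatch on the root cause's type. The handler names are invented.

```java
// Conceptual sketch of root-cause exception handling, as Solder provides it.
// RuntimeException stands in for the container's EJBException wrapper.
public class RootCauseSketch {

    // Walk the cause chain to the deepest exception.
    public static Throwable rootCause(Throwable t) {
        while (t.getCause() != null && t.getCause() != t) {
            t = t.getCause();
        }
        return t;
    }

    // Dispatch on the root cause's type, not on the wrapper -- this is what
    // declarative handling in the deployment descriptor could not do for us.
    public static String handle(Throwable wrapped) {
        Throwable root = rootCause(wrapped);
        if (root instanceof IllegalStateException) {
            return "session-expired-page"; // hypothetical handler outcome
        }
        return "generic-error-page";       // hypothetical fallback outcome
    }
}
```

With Solder, the equivalent dispatch is done by annotated handler methods rather than an `if` chain, but the unwrap-then-match-root-cause logic is the same.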
Integration with Alfresco
The external examiner web application and data import application integrate with Alfresco via two of Alfresco's RESTful APIs. For example, upload of an examiner appointment form by the External Examiners Web Application is handled as follows:
- When the user selects a file for upload and clicks the upload button, the PrimeFaces file upload component uploads the file to a temporary directory on the external examiner server.
- The Content Management Interoperability Services (CMIS) API 'Get Object' resource is used to return the node reference of the examiner's document folder.
- A multi-part POST to the Repository API 'upload' service is then used to upload the appointment form to the examiner's folder.
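The multipart POST in the final step can be sketched in plain Java. Only the `multipart/form-data` framing shown here is standard; the field names (`filedata`, `destination`) and any service path are placeholders rather than Alfresco's documented API, so treat them as assumptions.

```java
// Sketch of building a multipart/form-data body for a file upload plus a
// destination node reference. Field names are illustrative placeholders.
public class MultipartSketch {

    public static String body(String boundary, String fileName,
                              String destNodeRef, String fileContent) {
        String crlf = "\r\n";
        StringBuilder sb = new StringBuilder();
        // File part: the appointment form being uploaded.
        sb.append("--").append(boundary).append(crlf)
          .append("Content-Disposition: form-data; name=\"filedata\"; filename=\"")
          .append(fileName).append("\"").append(crlf)
          .append("Content-Type: application/octet-stream").append(crlf)
          .append(crlf).append(fileContent).append(crlf);
        // Destination part: the node reference of the examiner's folder,
        // obtained from the CMIS 'Get Object' call in the previous step.
        sb.append("--").append(boundary).append(crlf)
          .append("Content-Disposition: form-data; name=\"destination\"").append(crlf)
          .append(crlf).append(destNodeRef).append(crlf);
        // Closing boundary terminates the multipart body.
        sb.append("--").append(boundary).append("--").append(crlf);
        return sb.toString();
    }
}
```

The body would then be POSTed to the upload service with a `Content-Type: multipart/form-data; boundary=...` header.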
Design Patterns
The External Examiner Web Application implements two design patterns that help to simplify the application code. The design patterns are described in 'Real World Java EE Patterns - Rethinking Best Practices' by Adam Bien.

Persistent Domain Object (PDO) pattern
The domain model is a collection of Persistent Domain Objects. These are classes which model the real-world objects we want to store information about in the database, e.g. examiner, tenure, award, report. Together they form a rich model of those objects, including the business logic, in contrast to the anemic domain objects typically required for J2EE development. PDOs allow the developer to take an object-oriented approach to solving problems instead of having to work around the 'persistence plumbing' to interact with the domain model. Persistence metadata is added in the form of annotations that specify the mapping of objects to the database. The state of the PDOs is persisted to the database by the Entity Manager. As long as the PDOs remain in the attached state (i.e. managed by the Entity Manager) they can be modified through method calls and any changes will be flushed to the database when the objects are next saved.

Gateway pattern
The Gateway pattern allows PDOs to be exposed to the user interface layer. In our case this means being able to refer to domain model objects directly from JSF pages and components. The snippet below, from the examinerView page, illustrates this: the value of the tenuresTable is a direct reference to the examiner PDO's collection of tenures.

Tenures dataTable uses domain model objects directly
The combination of PDOs and Gateway allows the developer to manipulate the domain model objects cleanly without having to worry about the objects' persistent state. This results in a cleaner, smaller codebase. High memory consumption is a potential problem if large object graphs are loaded from the database or there are many concurrent users, but for our situation (approx. 20 users) profiling of the application indicated that this was not a problem.
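The way the two patterns fit together can be sketched in plain Java. This is a hedged, self-contained illustration: the class names are invented, and a `Map` stands in for the JPA EntityManager so the example runs without a container. In the real application the PDO is a JPA `@Entity` and the gateway is a stateful, conversation-scoped component.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the Gateway pattern: keep a domain object "attached" and hand
// the JSF layer a direct reference to it. Names invented for the sketch.
public class GatewaySketch {

    // Minimal stand-in for a PDO; the real one carries tenures, awards etc.
    public static class Examiner {
        private String name;
        public String getName() { return name; }
        public void setName(String n) { name = n; }
    }

    // Fake persistence context keyed by id; stands in for the EntityManager.
    public static final Map<Long, String> database = new HashMap<>();

    private Long id;
    private Examiner current;

    // Load an examiner and keep it attached for the user's conversation.
    public void find(Long examinerId) {
        this.id = examinerId;
        current = new Examiner();
        current.setName(database.getOrDefault(examinerId, ""));
    }

    // JSF pages bind straight to the PDO, e.g. value="#{gateway.current.name}".
    public Examiner getCurrent() { return current; }

    // Changes made through the reference are flushed when saved, as the
    // EntityManager flushes attached entities on transaction commit.
    public void save() { database.put(id, current.getName()); }
}
```

The key point is that the UI mutates the same object instance the gateway holds, so no transfer objects or copy-back code are needed.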
Lessons learned:
- Java EE 6 mostly lived up to its promise of simpler, cleaner, faster development. Significant effort was required to learn the technologies the first time around but subsequent developments on the same platform have been very rapid. Adam Bien's blog is well worth following for insight into 'just enough' Java EE application architecture.
- To truly realise the faster, easier development promise of Java EE 6, you need to augment it with JBoss Seam to fill in some of the missing/broken pieces.
- Basing the domain model on the XCRI CAP 1.1 information model was a wise choice. Although it was a more complex model than we might have created from scratch, we have reaped the benefit of that choice many times. Most recently, a QAA review has requested a change to the level of award detail stored with examiner records. Because of the flexibility of the XCRI-based domain model to represent most course structures, required changes to the domain model have been minimal. In addition, University Quality Improvement Service colleagues have seen the value of representing course (spec) and presentation (instance) separately and have decided to change their databases to fit the XCRI view of the world.
'XCRI thinking' spreads from the domain model to other University databases
- We used composition instead of inheritance in the XCRI-inspired parts of our domain model because we thought representing inheritance in the persistence mappings would result in an overly-complex database structure. If we started again today, we would just implement it with inheritance.
- Free open source ‘community’ editions of software tend to be fully featured, but bugs are more common and are fixed first in the corresponding Enterprise version. You can expect to get what you pay for. Testing your application against new versions of such third-party software is important. Community forums are generally very supportive, but identifying and fixing problems is time-consuming and goes against the desire for efficiency and effectiveness (more with less) that we are aiming for.
- Much benefit is to be gained by participating fully in open source communities. We have blogged about our experiences, have answered questions in community forums and have asked our own questions. In each case, responses have given us a better understanding of the technologies we have used. Don't be afraid to ask questions or blog your experiences. Even if you get some information wrong, community members will correct you and improve your understanding further. The feedback is valuable.
- With technologies like the Java EE stack, which have been evolving for several years, it is important to be able to identify the 'current truth'. In other words, a lot of information on the Web was correct for older versions of a technology and so is no longer relevant. This becomes a problem in particular when first learning about a new technology: in trying to solve problems, searches can turn up solutions which work but which are out of date and hence not the most appropriate. We encountered this issue many times during the development of the External Examiners Web Application. At one point, we followed good but dated guidance in the creation of the user interface, to create a nicely designed data transfer layer. Subsequently, using an up-to-date Java EE 6 approach, we made this layer redundant, so we were able to remove it entirely and replace it with direct use of PDOs in JSF pages (as described above). Doing so left us with a smaller, cleaner codebase. The lesson is to find out how current any solution or guidance is before applying it.
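The composition-over-inheritance choice mentioned in the lessons above can be sketched briefly. This is a hedged illustration with invented names, loosely following XCRI CAP's course/presentation split: shared attributes live in a component object that Course and Presentation each contain, rather than in a superclass they both extend.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of modelling shared attributes by composition instead of
// inheritance. Names invented; the real model follows XCRI CAP 1.1.
public class CompositionSketch {

    // Shared attributes held in a component object...
    public static class Description {
        public final String title;
        public Description(String title) { this.title = title; }
    }

    // ...which each class contains, avoiding the complex persistence
    // mappings that a shared superclass would have required.
    public static class Presentation {
        public final Description description;
        public Presentation(String title) { this.description = new Description(title); }
    }

    public static class Course {
        public final Description description;
        public final List<Presentation> presentations = new ArrayList<>();
        public Course(String title) { this.description = new Description(title); }
    }
}
```

With JPA, the inheritance version would instead map a class hierarchy (single-table or joined strategy), which is the database complexity we originally wanted to avoid.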