
In terms of the following categories:

1. Usability Standards
2. Naming Standards
3. Code Review Checklist
4. Deployment Process
5. Coach Development Guidelines
6. Performance Considerations
7. Standardization of Look and Feel
8. Standardized use of JavaScript and CSS (style sheets)
9. Creation and Usage of Custom Controls
10. Lean Custom HTML Usage
11. Process Server Lookups
12. Validation Error Messages
13. Resource Bundling
14. Controls Rules
15. Artifact documentation requirements
16. Unit Testing Guidelines

1 Answer

Best answer

1. Usability Standards

If there is a large amount of data, it is recommended to use a wizard-style user interface in which users can navigate back and forth between steps while the entries they have already made are preserved.
The user interface should be capable of handling large amounts of data without performance degradation.
Depending on business needs, an audit trail of user actions may or may not be required.
Accessibility requirements should be specified where there is a business need for them.
All screens should be consistent in their common elements and features, e.g. a common header/footer, a common way of searching, etc.
Feedback: the system should provide appropriate feedback to users, e.g. progress icons, on-page error messages, dialog boxes with status.
Breadcrumb/process progress bar: for large processes with many steps, states, and statuses, it is desirable to provide a progress bar that shows users where they are in the process flow.
Seamless integration with external systems.
Single Sign-On.
Multiple concurrent sessions may or may not be needed for the same task screens, and appropriate data integrity measures should be implemented; determine per usage whether optimistic or pessimistic locking is needed (a minimal locking sketch follows this list).
Security concerns: authorization of data at the section or element level may need to be addressed with appropriate control/section-based authorizations.
Required fields should be clearly indicated with a different color or a red asterisk.
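To illustrate the locking point above, here is a minimal optimistic-locking sketch in server-side JavaScript. The record structure, its version field, and the loadRecord/saveRecord helpers are hypothetical placeholders for whatever lookup and persistence services the application actually uses.

    // Optimistic locking sketch: reject the save if another session has
    // updated the record since this user loaded it.
    var current = loadRecord(tw.local.record.id);          // hypothetical lookup service call
    if (current.version != tw.local.record.version) {
        // Someone else changed the data; ask the user to reload instead of overwriting.
        tw.local.errorMessage = "This record was modified by another user. Please reload and retry.";
    } else {
        tw.local.record.version = current.version + 1;     // bump the version on a successful save
        saveRecord(tw.local.record);                       // hypothetical persistence service call
    }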
 

2. Naming Standards

BPD naming convention - The primary or top-level BPD should be clearly marked with a prefix or postfix identifier, e.g. "Primary Process" or "Top Level". All BPDs are generally named in the format {Verb} {Noun}, e.g. Review Request, Approve Request, Submit Ad-hoc Request, Process Ad-hoc Request, etc.
Service naming convention - {Type} {Verb} {Noun}. The type should specify the system or the kind of service; every UI service should start with "UI". Words should be capitalized with a space between each word; prepositions (on, at, in, etc.) use a lower-case letter instead of a capital. Examples: SYSTEMA Get Supplier Details, SYSTEMB Get Supplier Contact Details, SYSTEMC Submit Extension Request, SAP Create Purchase Order, BIZ Rule Map Purchasing Team to Time Zone, UI Submit Supplier Request, etc.
Variable type naming convention - {Prefix}{Noun}. The prefix can be the business entity name or the system name. The noun generically describes the fields grouped in the variable type and can consist of one or more words. Every word starts with a capital letter, with no spaces or underscores between words. Examples: ProjectRequest, ShoppingCart, PurchaseOrder, etc.
Variable naming convention - camel case (the JavaScript industry standard): one or more words, with the first letter lower case and the first letter of every subsequent word capitalized. Examples: attestationRequest, businessContext, processContext, etc.
Snapshot naming convention - {Project Name} {Release/Environment name} {Date}. Further snapshots created on the same date can add a V1, V2 suffix to this format (a short sketch of building such a name follows this section).
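As a small illustration of the snapshot convention, the sketch below builds a name in the {Project Name} {Release/Environment name} {Date} format with an optional V suffix; the function and its example values are purely illustrative.

    // Illustrative helper: builds a snapshot name such as "Supplier Portal UAT 2024-01-15 V2".
    function buildSnapshotName(projectName, releaseName, version) {
        var today = new Date();
        var date = today.getFullYear() + "-"
                 + ("0" + (today.getMonth() + 1)).slice(-2) + "-"
                 + ("0" + today.getDate()).slice(-2);
        var name = projectName + " " + releaseName + " " + date;
        return version ? name + " V" + version : name;     // add V1, V2, ... for later snapshots on the same date
    }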

3. Code Review Checklist

1. Is there sufficient documentation of the BPD and service artifacts, and is there color coding by type of task
2. Is JavaScript indented and commented properly
3. Is a reusable JavaScript function library created in a central server JS file
4. Is caching used where appropriate
5. Is a Unit Test harness built for each task and integration service
6. Is sufficient error handling in place at both the service and BPD level
7. What logging mechanism has been used, and is there sufficient logging with appropriate logging levels set, i.e. info, debug, error, alert (see the logging sketch after this checklist)
8. Is there direct access to the Process DB (Product DB)? Direct access should be used only when absolutely necessary, since upgrades and migrations may not go through with Product DB changes
9. Is the Performance Data Warehouse being leveraged, and has sizing of the PDW database been done
10. Is NOLOCK being used for SQL calls
11. Is string concatenation or a StringBuffer-style approach used for building large strings (see the concatenation sketch after this checklist)
12. What is the unit-tested load time in the development environment per task and integration
13. Are parameterized queries being used for SQL statements (see the parameterized query sketch after this checklist)
14. Is every message on the UI, in the BPD, and in task attributes connected to a central resource bundle so that they can be changed centrally
15. Is there hard coding of environment-specific values, e.g. hostname, port number, web service endpoint, etc., or are they kept in the environment configuration of the application (see the environment variable sketch after this checklist)
16. Is a maintenance shutdown feature implemented for the Process Application
18. Are there any outstanding validation errors in the Process Application
19. Is all static data appropriately cached, and is key-based caching used for partially cacheable data (see the caching sketch after this checklist)
20. Are default values set, or accidentally left out, for service input and output variables
21. Are all the static resources for the Process App, e.g. CSS, JavaScript libraries, etc., minified and bundled in zip files; images should also be optimized for web delivery
22. Are Artifacts appropriately tagged with Company identifier and type of service
23. Are all the naming conventions adhered to
24. Are decision services being used for business decisions and routing where appropriate
25. If copies of existing controls are made, they should be placed in a separate toolkit, not mixed in with the Process Application code
26. Is the business data and process data appropriately bundled into context variables at the BPD level, so that minimal modifications to the BPD are needed if additional data elements have to be added

27. Is autotracking turned off in BPDs
28. Is the "save execution context" option checked on
29. Is service exposure appropriately defined, e.g. Admin Services vs Portal Services vs Dashboards
30. Is the exposure of services properly governed by teams, with no accidental "All Users" exposures
31. Is only the minimum necessary business data exposed
32. Are reusable artifacts, e.g. domain business objects, generic utility dashboards, etc., properly classified in toolkits
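For checklist item 7, a sketch of level-aware logging in a server script. It assumes the script-level log object with debug/info/error methods that is available in heritage server scripts; if your environment exposes a different logger, substitute it. The variable names are illustrative.

    // Use the appropriate level for each message rather than logging everything at one level.
    log.debug("Map Purchasing Team to Time Zone: input = " + tw.local.purchasingTeam);   // detailed tracing
    log.info("Submit Supplier Request: started for request " + tw.local.requestId);      // normal milestones
    try {
        // ... integration call ...
    } catch (e) {
        log.error("Submit Supplier Request: integration failed - " + e);                 // failures needing attention
    }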
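For checklist item 11, the usual JavaScript alternative to repeated string concatenation is to collect the pieces in an array and join them once; this is a general-purpose sketch, and the tw.local.items list with its name field is illustrative.

    // Repeated "+=" concatenation builds a new string on every pass ...
    var html = "";
    for (var i = 0; i < tw.local.items.listLength; i++) {
        html += "<li>" + tw.local.items[i].name + "</li>";
    }

    // ... collecting fragments and joining once is the StringBuffer-style approach.
    var parts = [];
    for (var j = 0; j < tw.local.items.listLength; j++) {
        parts.push("<li>" + tw.local.items[j].name + "</li>");
    }
    var htmlFast = parts.join("");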
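For checklist item 13, a sketch of preparing a parameterized statement for the product's SQL integration services, assuming the SQLParameter type from the System Data toolkit; the table, column, and variable names are illustrative.

    // Use "?" placeholders and bind values instead of concatenating user input into the SQL string.
    tw.local.sql = "SELECT SUPPLIER_ID, SUPPLIER_NAME FROM SUPPLIER WHERE COUNTRY = ? AND STATUS = ?";
    tw.local.parameters = new tw.object.listOf.SQLParameter();

    tw.local.parameters[0] = new tw.object.SQLParameter();
    tw.local.parameters[0].value = tw.local.country;        // user-supplied value bound safely
    tw.local.parameters[0].type = "VARCHAR";

    tw.local.parameters[1] = new tw.object.SQLParameter();
    tw.local.parameters[1].value = tw.local.status;
    tw.local.parameters[1].type = "VARCHAR";
    // tw.local.sql and tw.local.parameters are then mapped to the SQL Execute Statement service inputs.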
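For checklist item 15, environment-specific values belong in the Process App's environment variable settings rather than in scripts; the sketch below reads them through tw.env, with illustrative variable names.

    // Host and port come from environment variables defined per environment (Dev/Test/Prod),
    // so the same snapshot can be promoted without code changes.
    tw.local.serviceUrl = "https://" + tw.env.supplierServiceHost + ":"
                        + tw.env.supplierServicePort + "/api/suppliers";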
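For checklist item 19, a generic key-based caching sketch (not tied to the product's own caching features): the expensive lookup runs only when a key has not been seen yet, and different keys are cached independently. The lookupCountryDetails helper is hypothetical.

    // Simple key-based cache; in practice the map would live in a long-lived scope
    // or be replaced by the product's result-caching options.
    var countryCache = {};

    function getCountryDetails(countryCode) {
        if (!countryCache.hasOwnProperty(countryCode)) {
            countryCache[countryCode] = lookupCountryDetails(countryCode);   // hypothetical expensive lookup
        }
        return countryCache[countryCode];
    }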
