Microservices Assessment Framework

Mohit is an experienced enterprise architect and blogger. He has consulted for various organizations and trained multiple teams, enabling them to successfully adopt and improve microservices architecture.

Based on his experience, Mohit is working on a microservices assessment framework with the following three objectives:

  1. Readiness – Assess whether your organization is ready to adopt microservices.
  2. Fitness – Assess whether microservices are a good fit for your organization.
  3. Review – Evaluate your microservices architecture and identify areas of improvement.



The proposed framework assesses the organization, its processes, and the base architecture. It includes questionnaires covering the following areas:

  1. Business Drivers – Determine whether you have clear and valid business drivers for MSA.
  2. Development Velocity – Determine whether you can benefit from MSA.
  3. Base Architecture – Determine whether the base architecture has all the required capabilities.
  4. Infrastructure – Determine whether your organization has developer- and MSA-friendly infrastructure.
  5. Organization Structure – Determine whether you have the organization structure required for MSA.
  6. Processes – Determine whether you have the organizational processes required for MSA.
  7. Individual Service Design – Evaluate the design of each individual service.

Stay tuned for more information, and please contact us to learn more.



Why Is Swagger JSON Better Than Swagger Java Client?

1. The Swagger Java-Based Client, Using Java Annotations on the Controller Layer

Pros and Cons

  • It’s the old way of creating web-based REST API documents through the Swagger Java library.
  • It’s easy for Java developers to code.
  • All API endpoint descriptions are added in the Java annotation parameters.
  • The Swagger API dependency has to be added to the Maven configuration file, pom.xml.
  • It creates performance overhead because of the extra processing time for generating the Swagger GUI files (CSS, HTML, JS, etc.). Parsing the annotation logic on the controller classes adds further overhead, and it makes the build heavier to deploy, which matters for microservices, where the build size should be small.
  • The code looks dirty because extra metadata has to be added to the Spring MVC controller classes through annotations. If the description of the API contract is long, it makes the code unreadable and unmaintainable.
  • Any change in the API contract requires a Java build and redeployment, even for simple text changes, like the API definition text.
  • The biggest challenge is sharing the contract with client/QA/BA teams before actual development and making frequent amendments. Service consumers may change their requirements frequently, and it is very difficult to make those changes in code, regenerate the Swagger GUI HTML pages, redeploy, and share the updated Swagger dashboard on the actual dev/QA environment.
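The verbosity described above is easy to see in a sketch. The controller, endpoint, and description text below are illustrative, using the `io.swagger.annotations` style common in Spring MVC projects:

```java
import io.swagger.annotations.Api;
import io.swagger.annotations.ApiOperation;
import io.swagger.annotations.ApiResponse;
import io.swagger.annotations.ApiResponses;
import org.springframework.web.bind.annotation.*;

// Illustrative only: every piece of contract text lives in annotation
// parameters, so any wording change forces a rebuild and redeployment.
@Api(value = "orders", description = "Operations on customer orders")
@RestController
@RequestMapping("/api/v1/orders")
class OrderController {

    @ApiOperation(value = "Fetch a single order by its id",
            notes = "Long contract descriptions placed here quickly crowd out the handler logic.")
    @ApiResponses({
            @ApiResponse(code = 200, message = "Order found"),
            @ApiResponse(code = 404, message = "No order with that id")
    })
    @GetMapping("/{id}")
    String getOrder(@PathVariable("id") String id) {
        return "order:" + id; // the real logic is dwarfed by the metadata above
    }
}
```

Even in this small example the annotation block is several times larger than the method body, which is the readability and redeployment problem described above.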

2. The Swagger JSON File Can Be Written Separately and Provides a Browser-Based GUI

Pros and Cons

  • In this newer approach, all of the above challenges with the Java-based client are solved.
  • The developer first creates a JSON file, shares it, and agrees on it with the service consumers and stakeholders. It is signed off after amendments, with no code change or redeployment required.
  • The code is cleaner, more readable, and more maintainable.
  • There is no extra overhead for file generation and processing, so performance is better and the build is more lightweight for microservices.
  • There is no code dependency for API contract changes.
  • The Swagger JSON file resides in the project binaries (inside src/main/resources/swagger_api_doc.json). We can deploy Swagger on one server and switch between environments.


You can copy and paste the swagger_api_doc.json file content into https://editor.swagger.io/, which helps you modify the content and renders it as an HTML page. The Swagger GUI provides a web-based interface similar to Postman.
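A minimal Swagger 2.0 file of the kind described above might look like the following; the service name, path, and descriptions are illustrative, not taken from a real contract:

```json
{
  "swagger": "2.0",
  "info": {
    "title": "Order Service API",
    "description": "Contract agreed with consumers before any code is written.",
    "version": "1.0.0"
  },
  "basePath": "/api/v1",
  "paths": {
    "/orders/{id}": {
      "get": {
        "summary": "Fetch a single order by its id",
        "parameters": [
          { "name": "id", "in": "path", "required": true, "type": "string" }
        ],
        "responses": {
          "200": { "description": "Order found" },
          "404": { "description": "No order with that id" }
        }
      }
    }
  }
}
```

Because this file lives beside the code rather than inside it, wording changes to summaries and descriptions can be agreed on and signed off without a rebuild.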

10 Challenges of Microservices and Solutions – Tips & Tricks

I am a cloud API developer and architect, currently working on Google Cloud Platform (GCP)-based microservices for a large retail client in the USA.

Transitioning to and implementing microservices creates significant challenges for organizations. I have identified these challenges and solutions based on my real-world experience running microservices in production.

I am writing this white paper in June 2018. At this time, microservices architecture is not mature enough to completely address all the existing challenges, but open source communities and IT product companies are trying to close these open issues, and new research on the topic focuses on finding solutions to them.

These are the 10 major challenges of microservices architecture, with proposed solutions:

1. Data Synchronization – Event sourcing architecture can address this issue using an async messaging platform; the Saga design pattern can also address this challenge.
2. Security – An API gateway can solve these challenges. Kong is a very popular open-source gateway used by many companies in production systems. A custom solution can also be developed for API security using JWT tokens, Spring Security, and Netflix Zuul/Zuul 2. Enterprise solutions such as Apigee and Okta (two-step authentication) are also available. OpenShift can be used for public cloud security, with its Red Hat Linux kernel-based security and namespace-based app-to-app security.
3. Versioning – This is taken care of by the API registry and discovery APIs using a dynamic Swagger API, which can be updated dynamically and shared with consumers on the server.
4. Discovery – This is addressed by API discovery tools like Kubernetes and OpenShift. It can also be done at the code level using Netflix Eureka. However, handling it at the orchestration layer is better, since these tools manage it rather than it being implemented and maintained through code and configuration.

5. Data Staleness – The database should always be kept up to date so that APIs fetch recent data. A timestamp can also be added to each record in the database to check and verify data freshness. Caching can be used, customized with an eviction policy acceptable for the business requirement.
6. Debugging and Logging – There are multiple solutions: logging can be externalized by pushing log messages to an async messaging platform like Kafka or Google Pub/Sub. A correlation ID can be provided by the client in the header of REST requests to track related logs across all pods/Docker containers. Local debugging of each individual microservice can also be done using an IDE or by checking the logs.
7. Testing – This can be addressed with unit testing by mocking REST APIs, mocking integrated/dependent APIs that are not available for testing using WireMock, BDD integration testing with Cucumber, performance testing with JMeter, and a good profiling tool such as JProfiler, DynaTrace, YourKit, or VisualVM.
8. Monitoring – Monitoring can be done using the open-source tool Prometheus in combination with Grafana (creating gauges and metrics), Kubernetes/OpenShift, InfluxDB, Apigee, or the combination of Grafana and Graphite.
9. DevOps Support – Microservices deployment and support challenges can be addressed using state-of-the-art DevOps tools such as GCP Kubernetes and OpenShift with Jenkins.
10. Fault Tolerance – Netflix Hystrix can be used to break the circuit if there is no response from an API within the given SLA/ETA.
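To make the fault-tolerance idea in challenge 10 concrete, here is a minimal sketch of the circuit-breaker pattern in plain Java. The class name `SimpleCircuitBreaker` and the failure threshold are hypothetical; in a real system Netflix Hystrix provides this behavior (e.g. via `@HystrixCommand` with a fallback method) along with timeouts, metrics, and automatic recovery, none of which this sketch attempts.

```java
import java.util.function.Supplier;

// Hypothetical minimal circuit breaker: after `threshold` consecutive
// failures the circuit opens and callers get the fallback immediately,
// protecting them from a slow or dead downstream API.
class SimpleCircuitBreaker {
    private final int threshold;   // consecutive failures before opening
    private int failures = 0;
    private boolean open = false;

    SimpleCircuitBreaker(int threshold) {
        this.threshold = threshold;
    }

    <T> T call(Supplier<T> primary, Supplier<T> fallback) {
        if (open) {
            return fallback.get();   // short-circuit: skip the remote call
        }
        try {
            T result = primary.get();
            failures = 0;            // a success resets the failure count
            return result;
        } catch (RuntimeException e) {
            if (++failures >= threshold) {
                open = true;         // trip the circuit
            }
            return fallback.get();   // degrade gracefully for this call
        }
    }

    boolean isOpen() { return open; }
}
```

A production breaker would also half-open after a cooldown to probe whether the downstream API has recovered; Hystrix handles that automatically.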

Blockchain in Insurance Claims


Blockchain is a distributed ledger initially used by the Bitcoin cryptocurrency and eventually by many banking organizations to record transactions between parties with high security. This is just the start of the blockchain era, and it is anticipated to have long-term sustainability and acceptance across various industries.

One of the biggest use cases in the insurance industry is the adoption of blockchain in claims processing. Insurance contracts involve various parties, such as agents, brokers, repair shops, and third-party administrators, with manual work and duplication at various stages of the value chain. Using blockchain, transactions can be verified without any human intervention, making the process completely automated at each stage.

Benefits of using blockchain in claims processing:

  1. The distributed ledger allows the various parties to update information securely, such as claim forms, evidence, and police reports, helping reduce loss adjustment expenses (LAE).
  2. Fraud Detection – Because the blockchain maintains a ledger across multiple parties, it can eliminate errors and fraud. Blockchain technology uses its high computing power to authenticate customers, policies, and transactions.
  3. Payments – Claim payments can be made without any intermediary authority for transaction validation, which helps reduce the overall operational cost of claims processing.
  4. Because these are highly secured transactions, multi-step review processes can be eliminated, resulting in speedier claims processing.