Clients are demanding more and more from their financial institutions, constantly seeking to receive value from their banks without losing security. Gaining users' trust is increasingly difficult, and losing it is very easy, so institutions constantly work to provide the highest quality of service in the most efficient way. Data has become extremely important for financial entities, growing in volume, velocity and diversity of sources, and companies therefore invest more resources in collecting and caring for it in order to provide better solutions. With all of this in mind, it is important to remain at the forefront, implementing tools that generate value for businesses so that they can differentiate themselves from their competition.
The design of applications for financial institutions has been modernized over the years. For decades, companies centered their applications and data on a mainframe. While this type of architecture responded well to the needs of the past, it has fallen short over time against the scalability, agility and reliability challenges demanded by today's industry. In this architecture, all applications and data live in a single computer system, consuming and competing for the same resources, even though it is clear that not all business services consume the same amount. As a result, an error in any application or data set has a collateral impact on the other applications, since they share resources, and scalability suffers because components cannot scale independently.
Over time, the client-server architecture appeared, separating the logic layer from the persistence layer; however, it continued to inherit the challenges of the mainframe. Similarly, the three-tier architecture appeared and, although reliability, agility and scalability improved, it still did not give work teams a complete separation of all business services.
These challenges were largely mitigated by the emergence of the microservices architecture, which goes hand in hand with moving databases to the cloud. It allows each business service, or business service domain, to live in a different layer and a different stack of the architecture. In this way, applications and data can be scaled independently and evolve over time without impacting other applications.
The architecture has evolved toward microservices; however, the same has not happened in the persistence layer, where scaling and managing data still presents challenges, often requiring downtime and complex manual procedures. These legacy databases are expensive and proprietary, with procedural languages and punitive licensing, and they remain monolithic. As a result, companies end up with an architecture of hundreds or thousands of microservices all pointing to a single repository, which prevents financial institutions from generating value.
Relational databases have been in the industry for decades, but business requirements have changed and so have access patterns. Experience has shown that relational databases are not designed to solve one specific need in the best possible way, but rather to solve many needs reasonably well.
Currently, many companies are unaware of new technologies and continue to follow traditional processes, since their teams are not able to explore tools that could serve business needs. The metaphor of the "hammer" applies here: many companies have been using a hammer for everything, whether to drive a nail, fix a door or loosen a screw. Our job is to make them realize that there are more tools than a hammer, and that each of them can better serve a specific need.
The first of these purpose-built databases is the key-value database, which provides single-digit millisecond latency at any scale. A table can have one record or trillions of records, and performance remains the same.
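The key-value access pattern can be illustrated with a minimal in-memory sketch, assuming a plain Python dict as a stand-in for a managed key-value store; the account table and attribute names below are invented for illustration. The point is that reads and writes go by primary key, so the cost of a lookup does not grow with the number of records.

```python
# Minimal sketch of the key-value access pattern. A hash-based store
# (here a plain dict standing in for a managed key-value database):
# single-key lookups cost the same whether the table holds one record
# or trillions.

accounts = {}  # primary key -> item

def put_item(account_id, item):
    """Write an item under its primary key."""
    accounts[account_id] = item

def get_item(account_id):
    """Single-key read; returns None if the key is absent."""
    return accounts.get(account_id)

put_item("ACC-001", {"owner": "Ana", "balance": 1500})
put_item("ACC-002", {"owner": "Luis", "balance": 320})

print(get_item("ACC-001"))  # {'owner': 'Ana', 'balance': 1500}
```

Note that there are no joins and no scans here: the access pattern is known in advance, which is precisely what lets a key-value database keep latency flat at any scale.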
Over time, it has also become clear that data is increasingly interconnected, and companies have become aware of those connections and taken an interest in analyzing them. This allows a 360° view of customers, analyzing fraud cases based on the relationships between users, and using this information for marketing. Since this type of analysis is very expensive in a relational model, graph databases are used: they store related data and run different types of queries over those relationships. Customers like Nike use this type of database for their social networks, at a scale of millions of nodes and millions of connections.
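The kind of relationship query a graph database answers can be sketched in a few lines, assuming a small invented network stored as an adjacency list (a stand-in for a real graph database). A typical fraud-analysis question is "which accounts are within two hops of a flagged account?", which is a breadth-first traversal:

```python
from collections import deque

# Minimal sketch of a graph relationship query. The nodes and edges
# below are invented for illustration; a graph database answers this
# kind of traversal natively instead of via relational joins.
edges = {
    "alice": ["bob", "carol"],
    "bob":   ["alice", "dave"],
    "carol": ["alice"],
    "dave":  ["bob", "eve"],
    "eve":   ["dave"],
}

def within_hops(start, max_hops):
    """Return all nodes reachable from `start` in at most `max_hops` edges."""
    seen = {start}
    frontier = deque([(start, 0)])
    while frontier:
        node, dist = frontier.popleft()
        if dist == max_hops:
            continue
        for neighbor in edges.get(node, []):
            if neighbor not in seen:
                seen.add(neighbor)
                frontier.append((neighbor, dist + 1))
    return seen - {start}

# Two hops from alice: her direct contacts plus contacts-of-contacts.
print(sorted(within_hops("alice", 2)))  # ['bob', 'carol', 'dave']
```

In a relational database each extra hop is another self-join; in a graph database the traversal follows stored relationships directly, which is why it stays practical at millions of nodes.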
Likewise, Internet of Things (IoT) applications have strongly pushed time-series databases. Time-series data is information ordered chronologically. It does not make sense to use a relational database for this type of analysis, since its transactional guarantees are not required: a time-series database is interested only in aggregations, not in updates or integrity relationships. It simply handles an ordered series of events, giving clients the ability to run analyses over a given window of time.
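The time-series pattern described above can be sketched as an append-only list of timestamped events queried only by aggregation over a window; the sensor readings below are invented for illustration, and a real time-series database would add compression, retention tiers and optimized window queries on top of the same idea.

```python
from datetime import datetime, timedelta

# Minimal sketch of the time-series access pattern: an append-only,
# chronologically ordered series of events. No updates, no integrity
# relationships; queries are aggregations over a time window.
events = [
    (datetime(2023, 1, 1, 10, 0), 21.0),
    (datetime(2023, 1, 1, 10, 5), 22.5),
    (datetime(2023, 1, 1, 10, 10), 23.0),
    (datetime(2023, 1, 1, 11, 0), 25.0),
]

def average_in_window(start, end):
    """Aggregate (average) readings whose timestamp falls in [start, end)."""
    values = [v for t, v in events if start <= t < end]
    return sum(values) / len(values)

window_start = datetime(2023, 1, 1, 10, 0)
avg = average_in_window(window_start, window_start + timedelta(hours=1))
print(round(avg, 2))  # average of the three readings in the 10:00 hour
```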
In this way, there is a database for each type of need. This not only ensures that each business service is better served, but also improves the reliability and agility of those services. However, the database transformation process can be complex and requires considerable effort, so there are some tips that all companies in the industry should follow.
First, direct your focus to activities that add value. By using managed services, your teams do not need to perform administrative or maintenance activities; companies can free themselves from those low-value tasks and focus on the ones that truly create opportunities for the business and differentiate it from its competition. Likewise, take advantage of the power of an API (Application Programming Interface): an external API layer lets you keep a pool of connections and manage their errors independently. In addition, you can make changes to the databases without affecting the clients that consume the data.
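The idea of an external API layer owning the connection pool can be sketched as follows; `FakeConnection`, `ConnectionPool` and `get_balance` are hypothetical names standing in for a real database driver and a real service operation. Callers only see the API function, so connection handling is centralized and the database behind it can change without affecting consumers.

```python
import queue

# Minimal sketch of an API layer that owns a connection pool.
# FakeConnection stands in for a real database driver; all names
# here are illustrative, not a specific product's API.

class FakeConnection:
    def query(self, sql):
        return "result of: " + sql

class ConnectionPool:
    def __init__(self, size):
        self._pool = queue.Queue()
        for _ in range(size):
            self._pool.put(FakeConnection())

    def execute(self, sql):
        conn = self._pool.get()      # borrow a connection
        try:
            return conn.query(sql)
        finally:
            self._pool.put(conn)     # always return it, even on error

pool = ConnectionPool(size=4)

# The API layer exposes a stable operation; the database behind it
# can be swapped or resized without touching this function's callers.
def get_balance(account_id):
    return pool.execute("SELECT balance FROM accounts WHERE id = '%s'" % account_id)

print(get_balance("ACC-001"))
```

Because every caller goes through `execute`, retries, timeouts and error handling live in one place instead of being duplicated across every microservice.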
In summary, by moving databases to the cloud, the architecture of the financial sector has evolved toward a complete separation of responsibilities, where each microservice has managed databases adjusted to its needs. In this way, financial services companies transform their businesses and innovate in the industry, with databases that give them greater dynamism and improve their response to the needs of today's world. Thus, scalability, reliability and agility go from being a challenge to being these companies' best allies.
Hebert Gomez, Solutions Architect at AWS