Tag Archives: Software Architecture

Designing an AWS-Based Notification System

To build an effective notification system, it’s essential to understand the components and flow of each notification service.

iOS Push Notifications with AWS

  • Provider: Host your backend on Amazon EC2 instances.
  • APNs Integration: Use Amazon SNS (Simple Notification Service) to interface with APNs (Apple Push Notification service).

Android Push Notifications with AWS

  • Provider: Deploy your backend on AWS Elastic Beanstalk or Lambda.
  • FCM Integration: Connect your backend to Firebase Cloud Messaging (FCM) over its HTTP API.

SMS Messages with AWS

  • Provider: Implement the sending logic in AWS Lambda functions.
  • SMS Gateway: Amazon Pinpoint can serve as the SMS gateway for delivery.

Email Notifications with AWS

  • Provider: Leverage Amazon SES for sending emails.
  • Email Service: Utilize Amazon SES’s built-in email templates.
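To make the iOS and Android flows above concrete, here is a small sketch of the payload Amazon SNS expects when you publish a single message to both platforms (the helper name is mine; only the `default`, `APNS`, and `GCM` keys come from SNS's mobile-push message format):

```typescript
// Build the JSON message Amazon SNS expects when publishing to mobile
// endpoints with MessageStructure: "json". The "APNS" and "GCM" values
// must themselves be JSON-encoded strings.
function buildSnsPushMessage(title: string, body: string): string {
  return JSON.stringify({
    default: body, // fallback for protocols without a platform-specific payload
    APNS: JSON.stringify({ aps: { alert: { title, body } } }),
    GCM: JSON.stringify({ notification: { title, body } }),
  });
}
```

The result is passed as the Message parameter of an SNS Publish call, with MessageStructure set to "json".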

System Components

User: Represents end-users interacting with the system through mobile applications or email clients. User onboarding takes place during app installation or new signups.

ELB (Public): Elastic Load Balancing (ELB) serves as the public entry point to the system, distributing incoming requests across the appropriate components and providing high availability and scalability.

API Gateway: Amazon API Gateway manages and exposes APIs to the external world. It securely handles API requests and forwards them to the Notification Service.

NotificationService (AWS Lambda — Services1..N): Implemented using AWS Lambda, this central component processes incoming notifications, orchestrates the delivery flow and communicates with other services. It’s designed to scale automatically with demand.

Amazon DynamoDB: DynamoDB stores notification content data in JSON format. This helps prevent data loss and enables efficient querying and retrieval of notification history.

Amazon RDS: Amazon Relational Database Service (RDS) stores contact information securely. It’s used to manage user data, enhancing the personalized delivery of notifications.

Amazon ElastiCache: Amazon ElastiCache provides an in-memory caching layer, improving system responsiveness by storing frequently accessed notifications.

Amazon SQS: Amazon Simple Queue Service (SQS) manages notification queues, including iOS, Android, SMS, and email. It ensures efficient distribution and processing.

Worker Servers (Amazon EC2 Auto Scaling): Auto-scaling Amazon EC2 instances act as workers responsible for processing notifications, handling retries, and interacting with third-party services.

Third-Party Services: These services, such as APNs, FCM, SMS Gateways, and Amazon SES (Simple Email Service), deliver notifications to end-user devices or email clients.

S3 (Amazon Simple Storage Service): Amazon S3 is used for storing system logs, facilitating auditing, monitoring, and debugging.
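As a rough sketch of how these pieces fit together, the NotificationService might route each incoming request to the matching SQS queue before the worker fleet takes over (the queue names and the `Notification` shape below are illustrative, not part of the design above):

```typescript
// A minimal sketch of per-channel routing inside the NotificationService.
// Queue names and the Notification shape are assumptions for illustration.
type Channel = "ios" | "android" | "sms" | "email";

interface Notification {
  channel: Channel;
  recipient: string;
  payload: string;
}

// One SQS queue per delivery channel, as in the component list above.
const QUEUE_BY_CHANNEL: Record<Channel, string> = {
  ios: "ios-push-queue",
  android: "android-push-queue",
  sms: "sms-queue",
  email: "email-queue",
};

// Returns the queue a worker pool should consume this notification from.
function routeToQueue(n: Notification): string {
  return QUEUE_BY_CHANNEL[n.channel];
}
```

In the real system, the Lambda would then call SQS SendMessage with the chosen queue's URL, and the auto-scaled workers would drain each queue independently.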

Design Considerations:

Scalability: The system is designed to scale horizontally and vertically to accommodate increasing user loads and notification volumes. AWS Lambda, EC2 Auto Scaling, and API Gateway handle dynamic scaling efficiently.

Data Persistence: Critical data, including contact information and notification content, is stored persistently in Amazon RDS and DynamoDB to prevent data loss.

High Availability: Multiple availability zones and fault-tolerant architecture enhance system availability and fault tolerance. ELB and Auto Scaling further contribute to high availability.

Redundancy: Redundancy in components and services ensures continuous operation even during failures. For example, multiple Worker Servers and Third-Party Services guarantee reliable notification delivery.

Security: AWS Identity and Access Management (IAM) and encryption mechanisms are employed to ensure data security and access control.

Performance: ElastiCache and caching mechanisms optimize system performance, reducing latency and enhancing user experience.

Cost Optimization: The pay-as-you-go model of AWS allows cost optimization by scaling resources based on actual usage, reducing infrastructure costs during idle periods.

Stackademic

Thank you for reading until the end. Before you go:

  • Please consider clapping and following the writer! 👏
  • Follow us on Twitter (X), LinkedIn, and YouTube.
  • Visit Stackademic.com to find out more about how we are democratizing free programming education around the world.

Angular & Microfrontends: Toy Blocks to Web Blocks

When I was a child, my playtime revolved around building vibrant cities with my toy blocks. I would carefully piece them together, ensuring each block had its own space and significance. As a seasoned architect with over two decades of industry experience, I’ve transitioned from tangible to digital blocks. The essence remains unchanged: creating structured and efficient designs.

Microfrontends:

Much like the city sectors of my childhood imaginations, microfrontends offer modularity, allowing different parts of a web application to evolve independently yet harmoniously. Angular’s intrinsic modular nature seamlessly aligns with this. This modular structure can be imagined as various sectors or boroughs of a digital city, each having its unique essence yet forming part of the larger metropolis.

AG Grid:

In my toy block city, streets and avenues ensured connectivity. AG Grid performs a similar function in our digital city, giving structure and clarity to vast amounts of data. With Angular, integrating AG Grid feels as natural as laying down roads on a plain.

<ag-grid-angular
  style="width: 100%; height: 500px;"
  class="ag-theme-alpine"
  [rowData]="myData"
  [columnDefs]="myColumns">
</ag-grid-angular>

These grids act as pathways, guiding the user through the information landscape.
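For completeness, the `myData` and `myColumns` bindings referenced in the template could be backed by plain objects like these (the field names are invented, and `ColDef` here is a local simplification of ag-grid-community's column definition type):

```typescript
// A local simplification of ag-grid-community's ColDef for illustration.
interface ColDef {
  field: string;
  sortable?: boolean;
}

// Column definitions bound to [columnDefs] in the template above.
const myColumns: ColDef[] = [
  { field: "district" },
  { field: "population", sortable: true },
];

// Row data bound to [rowData]; field names match the column definitions.
const myData = [
  { district: "North", population: 120000 },
  { district: "South", population: 95000 },
];
```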

Web Components and Angular Elements:

In the heart of my miniature city, unique buildings stood tall, each with its distinct architecture. Web components in our digital city reflect this individuality. They encapsulate functionality and can be reused across applications, making them the skyscrapers of our application. With Angular Elements, creating these standalone skyscrapers becomes a breeze.

import { Injector, NgModule } from '@angular/core';
import { createCustomElement } from '@angular/elements';
import { DashboardComponent } from './dashboard.component'; // path assumed

@NgModule({
  // Let Angular compile the component even though no template references it
  entryComponents: [DashboardComponent]
})
export class DashboardModule {
  constructor(injector: Injector) {
    // Wrap the Angular component as a standards-compliant custom element
    const customElement = createCustomElement(DashboardComponent, { injector });
    // Register it so it can be used anywhere as <my-dashboard>
    customElements.define('my-dashboard', customElement);
  }
}

Webpack and Infrastructure:

Beneath my toy city lay an imaginary network of tunnels and infrastructure. Similarly, Webpack operates behind the scenes in our digital realm, ensuring our Angular applications are optimized and efficiently bundled.

const { AngularWebpackPlugin } = require('@ngtools/webpack');

module.exports = {
  // ...
  module: {
    rules: [
      {
        // Hand Angular sources (and AOT artifacts) to the Angular compiler
        test: /(?:\.ngfactory\.js|\.ngstyle\.js|\.ts)$/,
        loader: '@ngtools/webpack'
      }
    ]
  },
  plugins: [
    new AngularWebpackPlugin()
  ]
};

Manfred Steyer:

In every narrative, there’s an inspiration. For me, that beacon has been Manfred Steyer. His contributions to the Angular community have been invaluable. His insights into microfrontends and architecture greatly inspired my journey. Manfred’s eBook (https://www.angulararchitects.io/en/book/) is a must-read for those yearning to deepen their understanding.

From the joys of childhood toy blocks to the complex software architectures today, the essence of creation is unchanging. Tools like Module Federation, Angular, Webpack, AG-Grid, and WebComponents, combined with foundational structures like the Shell, empower us not just to build but to envision and innovate.


API-First Software Development: A Paradigm Shift for Modern Organizations

In the fast-paced world of software development, organizations are constantly seeking innovative approaches to enhance their agility, scalability, and interoperability. One such approach that has gained significant attention is API-first software development. Recently, I stumbled upon an enlightening article by Joyce Lin titled “API-First Software Development for Modern Organizations,” which struck a chord with my perception of this transformative methodology.

API-first development prioritizes APIs in software design to create strong, interconnected systems. It’s a game-changer for modern organizations, and Lin explains its principles well.

The concept of separation of concerns particularly resonated with me. By decoupling backend services and frontend/client applications, API-first development enables teams to work independently and in parallel. This separation liberates developers to focus on their specific areas of expertise, allowing for faster development cycles and empowering collaboration across teams. The API acts as the bridge, the bond that seamlessly connects these disparate components into a cohesive whole.
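A tiny sketch of that bridge: once frontend and backend teams agree on a contract like the one below, each side can build against it independently (the types and stub are invented for illustration):

```typescript
// The agreed-upon contract: both teams code against these shapes.
interface UserProfile {
  id: string;
  displayName: string;
}

interface UserApi {
  getProfile(id: string): Promise<UserProfile>;
}

// The frontend can develop against a stub long before the real backend exists.
const stubApi: UserApi = {
  getProfile: async (id) => ({ id, displayName: `User ${id}` }),
};
```

When the real backend ships, it implements the same `UserApi` interface and the stub is swapped out without touching the frontend code.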

Moreover, Lin emphasizes the scalability and reusability inherent in API-first development. APIs inherently promote modularity, providing clear boundaries and well-defined contracts. This modularity not only facilitates code reuse within a project but also fosters reusability across different projects or even beyond organizational boundaries. It’s a concept that aligns perfectly with my belief in the power of building on solid foundations and maximizing efficiency through code reuse.

Another crucial aspect Lin highlights is the flexibility and innovation that API-first development brings to the table. By designing APIs as the primary concern, organizations open the doors to experimentation, enabling teams to explore new technologies, frameworks, and languages on either side of the API spectrum. This adaptability empowers modern organizations to stay at the forefront of technological advancements and fuel their drive for continuous innovation.

After reading Lin’s article, I firmly believe that API-first development is not just a passing trend but a revolutionary approach that unleashes the full potential of modern organizations. The importance of API-first design, teamwork, flexibility, and compatibility aligns with my personal experiences and goals. This methodology drives organizations towards increased agility, scalability, and efficiency, empowering them to succeed in the constantly changing digital world.

Thank you, Joyce Lin, for your insightful article on API-First Software Development for Modern Organizations.


Managing Tech Debt: Balancing Speed & Quality

When faced with the discovery of technical debt within a team, there are three possible approaches to consider:

To effectively manage technical debt, it is crucial to strike a balance between speed and quality. This involves allocating sufficient time for proper planning, design, and testing of software, ensuring its long-term maintainability and scalability.

If you’d like to explore this topic further, the following resources can provide more insights:

Overcoming Limitations: Creating Custom Views in Trino Connectors (JDBC) without Native Support

During a feasibility test using distributed SQL (Trino/Starburst) to handle large volumes of ad hoc SQL queries, a common challenge arose. Trino, an open-source distributed SQL query engine, supports various connectors for interacting with different data sources. However, we discovered that creating views/tables on Oracle-based connectors is not directly supported by Trino. In this article, we will explore a way to overcome this limitation by leveraging a dummy connector to create custom views in Trino.

Solution Steps:

  • Create a Dummy Connector:

To enable the creation of custom views in Trino, we need to set up a dummy connector. This connector will serve as a catalog for storing the custom views and tables.

Create a new file named dummy.properties and add the following content:

connector.name=memory

  • Restart Trino:

Restart the Trino server to apply the configuration changes and make the dummy connector available.

  • Verify and Select the Catalog:

Check the available catalogs using the following command:

trino> show catalogs;
 Catalog
---------
dummy
jmx
memory
oracle
system
tpcds
tpch
(7 rows)
trino> use dummy.default;
USE

  • Create Custom Views:

Now that the dummy connector is set up and selected, we can create custom views using SQL statements. Let’s assume we want to create custom views based on tables from the oracle.hr schema; note that oracle is the catalog name for the Oracle connector in this example.

-- Create custom view
CREATE VIEW cust_emp_v AS SELECT * FROM oracle.hr.emp;
CREATE VIEW cust_dept_v AS SELECT * FROM oracle.hr.dept;

This solution enables us to perform complex analytics and join data from multiple connectors seamlessly, creating tables/views in Trino. By sharing this article, I aim to assist others who may face similar challenges when working with Trino and Oracle databases.

The Rise of Analytical Engineering: Embracing a Data-Driven Future

I wanted to share my thoughts on an exciting trend that I believe is reshaping the data landscape: analytical engineering. As someone who has personally experienced this shift, I can confidently say that it holds immense potential and opens up thrilling opportunities.

Analytical engineering is at the forefront of the data analytics field, bridging the gap between traditional data engineering and advanced analytics. By combining the best of both worlds, it empowers organizations to uncover deeper insights and make informed, data-driven decisions.

What truly sets analytical engineering apart is its ability to connect data teams with business stakeholders. No longer confined to isolated data operations, analytical engineers actively participate in strategic discussions, contribute to shaping priorities, and align data initiatives with business objectives. This collaboration is a game-changer, driving tangible value and fueling business growth.

At the core of analytical engineering lies the power of SQL and data modelling. These skills enable analytical engineers to transform and analyze data, creating robust data models that generate accurate and actionable insights. By leveraging modern data stack tools like DBT, analytical engineers streamline the data pipeline, ensuring seamless data ingestion, transformation, and scheduling.

Another critical aspect of analytical engineering is the empowerment of self-service analytics. By providing intuitive tools and platforms, analytical engineers enable business users to explore and analyze data independently. This democratization of data fosters a culture of data-driven decision-making, empowering individuals at all levels to unlock valuable insights without relying solely on technical teams.

The demand for analytical engineering skills is skyrocketing as businesses increasingly recognize the competitive advantage of advanced analytics. Roles like analytics engineer offer professionals a unique opportunity to leverage their technical expertise while driving impactful business outcomes. It’s an exciting time to be part of this field, with competitive salaries and ample room for career growth.

As an Enterprise Solution Architect, I have personally witnessed the transformative power of analytical engineering. It is an exciting career path that merges technical excellence with business acumen, enabling professionals to shape priorities, drive innovation, and significantly impact organizational success. While analytical engineering takes the spotlight, it is important to acknowledge the continued importance of data engineering, as the two disciplines complement each other.


The Symphony of ViewModel Composition: A Tale from Chennai

In the bustling city of Chennai, India, amidst the vibrant financial landscape, two brilliant software engineers, Durai Kasinathan and Amar Samant, embarked on a remarkable journey to bring harmony to the world of bank reconciliation. Their vision was to create a sophisticated Bank Reconciliation Application that would conduct a symphony of financial balance by harmonizing four essential microservices, each dedicated to a specific business domain.

The Visionaries: Durai Kasinathan and Amar Samant

Durai Kasinathan and Amar Samant, close friends and fellow software engineers, shared a passion for unravelling complexities and creating efficient solutions. Inspired by Chennai’s rich cultural heritage and the rhythmic beats of Carnatic music, they named their masterpiece “The Symphony of ViewModel Composition.”

The Harmonious Ensemble: Four Reconciliation Microservices

  1. Regulatory Raga (Microservice-1): Conducting regulatory compliance, ensuring that the financial transactions dance in accordance with the laws and guidelines.
  2. Lending Melody (Microservice-2): Orchestrating the world of loans and debt reconciliation, where financial chords are played with finesse.
  3. Operational Sangeet (Microservice-3): Maintaining the operational rhythm of the financial institution, ensuring seamless workflows.
  4. Market Rhythm (Microservice-4): Composing the symphony of market insights, guiding investment decisions with melodious analytics.

The Crescendo of ViewModel Composition

Driven by their vision, Durai and Amar delved into the art of ViewModel Composition. They divided the complex bank reconciliation process into smaller, harmonious parts represented by distinct ViewModels for each microservice.

The Ensemble Comes Alive: The Bank Reconciliation Application

As the requests flowed into the Reconciliation Application, the Catalog Handler skillfully matched each one to the appropriate microservice. The Composition Gateway conducted the ViewModel Composition, orchestrating a grand symphony of financial reconciliation.
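In spirit, the Composition Gateway works like this sketch: every microservice contributes its slice of the view model, and the gateway merges the slices into one response (the slices and field names below are illustrative, not the real application code):

```typescript
// The composed reconciliation view model is an open bag of fields,
// each field contributed by one microservice.
type ViewModel = Record<string, unknown>;

interface Contributor {
  name: string;
  contribute(accountId: string): ViewModel;
}

// Two of the four services from the story, with invented fields.
const contributors: Contributor[] = [
  { name: "regulatory", contribute: () => ({ complianceStatus: "ok" }) },
  { name: "lending", contribute: () => ({ outstandingLoans: 2 }) },
];

// The gateway asks every contributor for its slice and composes the result.
function composeViewModel(accountId: string): ViewModel {
  return contributors.reduce(
    (vm, c) => ({ ...vm, ...c.contribute(accountId) }),
    { accountId },
  );
}
```

Each service owns only its own fields, so the slices can evolve independently as long as their names do not collide.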

A Resounding Success: Financial Harmony in Chennai

The Symphony of Financial Reconciliation became a resounding success, bringing efficiency and balance to the financial landscape of Chennai. Durai Kasinathan and Amar Samant’s vision of harmonizing diverse microservices through ViewModel Composition became the beacon for financial institutions across the city.

In the heart of Chennai, the Symphony of Financial Reconciliation continues to resonate, uniting the diverse business domains in a mesmerizing financial symphony, thanks to the brilliance of Durai Kasinathan and Amar Samant.

And thus, the legacy of the Symphony of Financial Reconciliation lives on, a timeless tale of innovation and collaboration in the ever-evolving world of finance.

Please note that for the sake of storytelling and to maintain a relatable context, I have used the names of my best friends as characters: “Durai Kasinathan” and “Amar Samant,” who represent the brilliant software engineers in my narrative.

Embarking on a Journey with Micro Frontends: A Keralaite Software Architect’s Tale

I am Shanoj, a passionate software architect hailing from the enchanting land of Kerala, India. In this article, I am thrilled to take you on a captivating journey of discovery and innovation as I explore the fascinating world of micro frontends. Join me as I unravel the transformation of our software development approach, much like the serene backwaters of Kerala flowing into uncharted territories.

The Monolithic Conundrum:

As I stood amidst the breathtaking scenery of Kerala, I couldn’t help but notice the striking similarity to our software development challenges. Our team was working on a massive monolithic frontend application that mirrored the tangled maze of the region’s lush forests. Maintenance was cumbersome, and collaboration among teams seemed like a distant dream.

Embracing Microservices in the Backend:

Inspiration struck as I witnessed the seamless harmony of Kerala’s backwaters. Microservices were transforming the backend development landscape, offering modular and autonomous solutions. I couldn’t help but wonder: why not apply the same principles to our frontend development?

Discovering Micro Frontends:

Eager to explore this new frontier, I dived into the realm of micro frontends, akin to an explorer charting uncharted waters. It was then that I stumbled upon the enchanting magic of Webpack’s Module Federation. Like the spices that give Kerala cuisine its unique flavor, Webpack added a delightful twist to our frontend architecture.

Designing the Shell Application:

Just as Kerala’s traditional houseboats, locally known as “Kettuvallams,” navigate through the backwaters, we set out to build our shell application. This application would act as the captain, orchestrating and composing the smaller modules or micro frontends.

Setting Sail with Micro Frontends:

Each micro frontend was a manifestation of a unique feature, much like the colourful Kathakali dance masks of Kerala. With Angular Architects’ Module Federation, we effortlessly exposed and consumed these micro frontends, each residing on its own independent server.

Dynamic Loading in Full Swing:

The vibrant rhythm of Kerala’s snake boat races inspired us to optimize our application’s performance with dynamic loading. Fetching micro frontends only when needed reduced the initial load time, creating a faster and more delightful user experience.
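Stripped of the Angular and Module Federation machinery, the dynamic-loading idea looks like this: each micro frontend registers a lazy loader, and the shell fetches and caches it only on first use (names and shapes are illustrative):

```typescript
// A loader produces a remote module on demand.
type Loader<T> = () => Promise<T>;

class MicroFrontendRegistry<T> {
  private loaders = new Map<string, Loader<T>>();
  private cache = new Map<string, T>();

  register(name: string, loader: Loader<T>): void {
    this.loaders.set(name, loader);
  }

  // Loads the remote only when first requested, then serves it from cache.
  async load(name: string): Promise<T> {
    const cached = this.cache.get(name);
    if (cached !== undefined) return cached;
    const loader = this.loaders.get(name);
    if (!loader) throw new Error(`unknown micro frontend: ${name}`);
    const mod = await loader();
    this.cache.set(name, mod);
    return mod;
  }
}
```

In our Angular setup this role is played by Module Federation’s dynamic imports; the registry above just makes the load-on-first-use behaviour explicit.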

Scaling New Heights:

Our micro frontend architecture breathed new life into our development process, much like the refreshing monsoon rains of Kerala. This modular approach empowered teams to work independently, scale specific features as required, and deploy with unwavering confidence.

As the sun sets over the horizon of Kerala, my journey with micro frontends comes to a close, leaving behind a legacy of innovation and transformation. Like the ever-evolving culture of Kerala, our software journey will continue to progress, embracing the wonders of micro frontends and unlocking new possibilities.

So, dear readers, as you traverse the lush greenery of Kerala in your thoughts, take a moment to contemplate how micro frontends can revolutionize your own software development voyage. Embrace the change, and together, let’s navigate the boundless waters of innovation and growth!