Case Story Archives - Arrk Group

Arrk's Journey Into Migrating an Older Technology Platform

By Arrk Group

4 mins read

We live in a world where digital connections and student lives increasingly intersect. With online networking becoming the main focus of most university students, our client is one of the forerunners in this industry. Their brainchild is a platform that has taken the world by storm: students get benefits such as discounts, event details, and a place to store their IDs virtually. But with such a big responsibility comes the problem of storage.

So, let’s take a look at how we helped our client store data securely and at lower costs.

Who is the client?

Our client is a company that believes in creating communities with similar interests. Their focus is to create safe spaces for students. A space where they could get all the details about events happening around them or discounts at local stores.

Which is why this product was born.

The platform was made to help students connect with one another while getting the best offers around them.

This way, students could enjoy their university life at a limited cost.

This product, so far, has over 1 million happy users. And the numbers keep growing.

But they faced a challenge!

Considering the number of users on the platform, there were bound to be issues with data storage. This is exactly what our client faced. Previously, all their information systems were hosted on their own office premises, which brought added risks.

Why?

Because they were using older technology and tightly coupled monoliths.

This was when they realized they needed to move to a cloud-based platform and restructure their entire storage process around modern microservices.

What solution did we offer?

First, we engaged with the client to understand how their existing app worked and what they expected it to do. Next, we plotted the latest technological needs to turn PRODUCT into PRODUCT 2.0. The new version was set to have richer features than the old one and more brand collaborations.

There were 8 stages through which the solution unfolded:

  1. Discovery phase: The team explored the existing system and scrutinized its technological and functional aspects.
  2. The strategy: A mixed approach was used, combining lift-and-shift with re-platforming and restructuring, to migrate the existing platform into a more agile form.
  3. Technology roadmap: The team created a roadmap to highlight the stages of implementation. The primary focus was to reuse as many existing components as possible.
  4. Creating user journeys: Our technical experts mapped out the entire user journey, including how the web and app interfaces would interact and how a revamped backend would work with them to offer a seamless experience.
  5. Microservice architecture: Our experts crafted fine-grained microservices that made the platform flexible and easily scalable for front-end interactions.
  6. Event-driven serverless architecture: We implemented an event-driven serverless architecture to reduce operational costs and maximize responsiveness (a minimal sketch follows this list).
  7. DevOps and automation: We embraced infrastructure as code, added DevOps practices, and automated the pipeline to improve overall stability and reliability. Deployment became faster, automated, and far less stressful for our developers.
  8. Observability: We introduced a comprehensive monitoring and alerting system that helped proactively resolve any issue the platform faced.
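
To make stage 6 concrete, here is a minimal sketch of an event-driven AWS Lambda handler in Python that consumes messages from an SQS queue. The event shape is the standard SQS-to-Lambda integration; the `process_student_event` helper and the payload fields are hypothetical, not the client's actual code.

```python
import json
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)


def process_student_event(payload: dict) -> None:
    # Hypothetical business logic: update a discount or event record
    # for the student identified in the message payload.
    logger.info("Processing event for user %s", payload.get("user_id"))


def handler(event, context):
    """Entry point for an SQS-triggered Lambda (standard SQS event shape)."""
    for record in event.get("Records", []):
        payload = json.loads(record["body"])
        process_student_event(payload)
    # A successful return lets Lambda remove the processed batch from the queue.
```

Because the queue decouples producers from this consumer, traffic spikes simply lengthen the queue rather than overloading a server.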

The technology we used!

To craft this cost-effective and secure solution, we used multiple technologies and AWS services such as:

  1. AWS EC2 and containers
  2. AWS Lambda and serverless
  3. AWS SQL and NoSQL persistence
  4. AWS SQS and SNS for a decoupled architecture, and AWS events
  5. AWS IAM, Organisations, Identity Center
  6. AWS logs, metrics, and monitoring (see the alarm sketch after this list)
  7. AWS Data Lake and Redshift data warehouse
  8. AWS Cost Explorer and Budgets
  9. AWS CloudFormation (infrastructure as code and automation) / CircleCI
  10. AWS DevOps tools
  11. Datadog for observability across the full platform
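
As an illustration of the monitoring and alerting entry (item 6), here is a hedged boto3 sketch that raises a CloudWatch alarm on Lambda errors and notifies an SNS topic. The function name, topic ARN, and thresholds are placeholders rather than the client's actual configuration.

```python
import boto3

cloudwatch = boto3.client("cloudwatch")

# Placeholder identifiers -- substitute real resource names/ARNs.
FUNCTION_NAME = "product-orders-service"
ALERT_TOPIC_ARN = "arn:aws:sns:eu-west-2:123456789012:platform-alerts"

cloudwatch.put_metric_alarm(
    AlarmName=f"{FUNCTION_NAME}-errors",
    Namespace="AWS/Lambda",
    MetricName="Errors",
    Dimensions=[{"Name": "FunctionName", "Value": FUNCTION_NAME}],
    Statistic="Sum",
    Period=300,               # evaluate in 5-minute windows
    EvaluationPeriods=1,
    Threshold=5,              # alert after 5 errors in a window
    ComparisonOperator="GreaterThanThreshold",
    TreatMissingData="notBreaching",
    AlarmActions=[ALERT_TOPIC_ARN],
)
```

Alarms like this feed the proactive alerting described in stage 8 of the solution.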

What have we achieved?

Thanks to our solution, the revamped application emerged as:

  1. Cloud native – The application now seamlessly harnesses the power of cloud-based principles.
  2. Built on microservices and AWS – Utilizing microservices and AWS, we maximized the platform’s potential.
  3. A decoupled, event-driven platform – The platform can now adapt to the latest technological advancements.
  4. Scalable – The platform can now serve over 4.5 million users concurrently.
  5. Revenue-generating – The platform and its ecosystem can now generate revenue of over 6 million GBP.
  6. Loved by students – The new platform has become one of the most loved among students in the UK.

Finally!

We at Arrk Group UK played a pivotal role, bringing precision and expertise to help the client navigate the difficult task it had set for itself. By taking the time to understand the technological and functional differences, we provided a roadmap for a platform that could handle the surge of students looking to use it.

Our dedication to automation and DevOps practices helped remove any stress on our developers and brought in a culture of efficiency and reliability.

Ultimately, our collaboration shows that we help solve problems and unlock a new world of possibilities with the client.


Journey of Technological Evolution - SharePoint 2013 to Online

By Arrk Group

4 mins read

In a world where innovation is one of the drivers of growth, our client emerged as the leader in PEEK and PAEK-based polymer solutions. With a legacy extending over three decades, they have established themselves and continue to thrive even in the most competitive markets. With a worldwide network of distribution centres, their business extends from mere manufacturing to technical support, market development, and design application, which is why they are considered the go-to partner across multiple sectors.

The Challenge of Migration

Imagine this – A thriving business within an evolving technical landscape that requires seamless collaboration.

The task we were assigned?

To migrate the old SharePoint 2013 site to a dynamic SharePoint Online one. However, transferring the data posed some difficulties. The real problem lay in migrating the SharePoint Designer workflows, which were crucial to their operations.

Even though the concept sounds straightforward, some intricacies could not be avoided:

  1. Obsolete platform – SharePoint 2013, the main backbone of the business, was becoming obsolete.
  2. Complex workflow – The existing workflow was quite intricate and turned routine maintenance into a tedious task.
  3. Compatibility issues – Differences between SharePoint Online and SharePoint 2013 left us with workflow gaps.
  4. Workflow issues – The existing workflow was riddled with problems that needed manual intervention to put things right.

Our Tailor-made Solution

In the face of this challenge, we embarked on a journey to help our client embrace innovation. The cloud roadmap led to a logical solution – redeveloping the workflow using Power Automate (formerly Microsoft Flow).

The journey we planned was:

  1. Mapping out the new idea – We consulted multiple business users to refine and finalize the approval process and align it with current market trends
  2. Crack the code – To dissect the present SharePoint workflow and find the technical gaps
  3. Power of Automation – Arm ourselves with Power Automate to start the redevelopment process.

This approach was chosen to provide speedy results. The benefits we saw:

  1. Speedy evolution – Power Automate follows a ‘no or low code’ approach that helped speed up the development process
  2. Streamlined effort – With the simpler development process, integrating SharePoint with Outlook became a breeze
  3. Embrace the cloud – The redeveloped workflow is independent of on-premises environments, which aligned perfectly with our cloud-centric approach.

The Technology Behind The Workflow

Behind every client triumph sits a set of well-connected technologies. This client was no exception. We used:

  1. Power Automate – The main attraction of our workflow – provided the processing power and agility.
  2. SharePoint Online – The canvas that became the base of our process and gave us room to innovate
  3. SharePoint CSOM/REST API development – Supporting the technical intricacies where the magic took place (a hedged example follows this list)
  4. Exchange Online – The seamless communications platform that made gathering and routing information effortless
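
To give a flavour of the CSOM/REST work in item 3, here is a minimal sketch that reads items from a SharePoint Online list through the REST API using Python's requests library. The site URL, list name, and bearer-token acquisition are placeholders, and the real project may equally have used CSOM or Power Automate connectors.

```python
import requests

# Placeholder values -- the real tenant, site, list and token flow differ.
SITE_URL = "https://contoso.sharepoint.com/sites/approvals"
LIST_TITLE = "ApprovalRequests"
ACCESS_TOKEN = "<OAuth bearer token obtained via Azure AD>"

response = requests.get(
    f"{SITE_URL}/_api/web/lists/getbytitle('{LIST_TITLE}')/items",
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Accept": "application/json;odata=verbose",
    },
    timeout=30,
)
response.raise_for_status()

# Each item carries the columns defined on the list, e.g. Title and Status.
for item in response.json()["d"]["results"]:
    print(item["Title"], item.get("Status"))
```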

What does the future hold for our client?

Thanks to our efforts, our client's data migration is now a technological marvel. It is a testament to our client's adaptability and shows how embracing something new can lead to excellent results.

With the previously riddled SharePoint 2013 website now sitting comfortably in SharePoint Online (powered by Power Automate) – our client can directly collaborate with anyone and everyone.

This shows that our client’s journey is not only about code and bytes but about embracing innovation and growth.

Arrk Your Way To Glory!

Sometimes in your daily life, you may come to a crossroads where you must embrace technological advancements to grow your business. But there should always be a guiding force that will help support your transformation.

This guiding force is Arrk.

We do not only offer solutions; we offer partnerships. With our expertise spanning multiple industries and innovative technological landscapes, we help clients evolve with the global scenario.

If you are looking for a partner who will understand your challenges and tailor-make solutions for your unique problems, choose Arrk Group today!


Converting Business Services to a Complete Serverless Architecture

By Arrk Group

5 mins read

Most brands now look to give people an experience, whether in the form of a service or a product. No matter what, the customer should walk away with a pleasant experience and remember the brand the next time they want to buy. In the same way, our client believes in promoting communities and societies for a common group.

One such product that our client has developed helps students manage their student life, with details such as events, ID proof, and discounts – all at their fingertips. This product has now grown to over 1 million members and offers more features and benefits to its audience.

But that is where the problem lies!

Our client was left with the problem of managing so many members. Multiple services they offered were deployed on servers whose maintenance became difficult, and hosting these services came with high costs, which reduced the profit margin.

So, what was the challenge?

We were challenged to understand the current technology and optimise the costs the client incurred in managing these services.

Let’s take a look at our client first!

Our client is an agency that believes in creating digital products and services that connect people with common interests to one another. They bring together the best talents to build engaging communities with common interests. One of their best services in the market is this product – a platform where college students can get the best money advice, offers, discounts, and more.

The platform was facing problems due to overloaded servers!

Once we understood the existing services and the infrastructure used, we devised a solution plan to re-platform the current services offered on AWS serverless components.

Why did we look into AWS serverless components for them?

AWS provides technology for managing data, running code and integrating apps without managing servers. The serverless technology component has features such as automatic scaling and a pay-for-use billing model that helps optimise costs and improve agility.

Serverless applications on AWS typically start with AWS Lambda, which helps:

  1. Eliminate operational overheads so that teams can launch their product in the market faster
  2. Provide a pay-for-value billing model that optimises costs automatically
  3. Automatically scale from zero to peak demand and adapt to customer requirements as fast as possible
  4. Build better applications faster and more easily

So, how did we solve the problem?

The journey started with a series of strategic steps orchestrated by our solution architects. First, a comprehensive analysis of the existing code was carried out, including a careful examination of the present architecture. Once the assessment was complete, the porting and re-platforming phase began so the code could be modified.

The back-end engineers were pivotal in the entire transformation process by leveraging their expertise and migrating the code to AWS Lambda Serverless components. This move redefined the architecture to enable the decoupling of functionalities into modular units that operate independently within the AWS serverless ecosystem.

Central to this was the integration of AWS Lambda, which functions as the brain of the operation and plays a dual role:

  1. Executing the business logic
  2. Interacting with AWS API gateways

This operation helps streamline communications between serverless components and the external environment, providing a seamless user experience.
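
As a sketch of this dual role, here is a minimal Python Lambda handler wired to an API Gateway proxy integration: it runs a piece of business logic and returns the JSON response API Gateway expects. The route, field names, and discount logic are illustrative only, not the client's actual service.

```python
import json


def get_discounts_for_user(user_id: str) -> list[dict]:
    # Hypothetical business logic; in practice this would query
    # the platform's data store for the user's active offers.
    return [{"brand": "ExampleBrand", "discount_pct": 15}]


def handler(event, context):
    """Lambda entry point for an API Gateway proxy integration."""
    user_id = (event.get("pathParameters") or {}).get("userId", "")

    if not user_id:
        return {"statusCode": 400, "body": json.dumps({"error": "userId is required"})}

    discounts = get_discounts_for_user(user_id)

    # API Gateway proxy integrations expect statusCode, headers and a string body.
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"userId": user_id, "discounts": discounts}),
    }
```

Each such function can be deployed and scaled independently, which is what makes the decoupling into modular units possible.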

During deployment, the solution used AWS DevOps tools such as AWS CodePipeline, CodeDeploy, and CodeBuild. This helped speed up the launch of new products and features, minimising time to market and bolstering the application’s competitive edge.

One of the biggest advantages of the new architecture is dynamic resource allocation. Unlike traditional server models with fixed resources, back-end computation now executes only when required. This strategy led to substantial cost savings, as charges were based on actual usage rather than the lease of an entire server.

What technology stacks did we use?

The technology stack we used for this project was AWS Lambda, AWS API Gateway, Python, Boto3, AWS CodePipeline, CodeBuild and CodeDeploy, and AWS CloudWatch Logs.

What did our solution achieve?

Transforming the existing server-based components to AWS serverless components produced major results. As the model changed from conventional payments to payment for actual usage, there was a notable change in costs. Spending now aligned with real-time resource consumption, resulting in more cost-effective utilisation of the cloud.

One of the biggest outcomes of our solution was a reduction of over 50% in AWS expenses with no reduction in functionality. This highlights the efficiency of the serverless approach and its positive impact on the bottom line.

Another benefit of AWS serverless components is that specific components can be modified and deployed individually, which heightened the flexibility of the entire process. Each service is now independent of the others, meaning responses to user demands and updates are faster – the more adaptive approach the evolving market requires.

Final Thoughts

Adopting AWS serverless components does provide tangible outcomes. This shift can be quite advantageous, especially given the usage-based billing. So, even though individual components can be expensive, using them with a well-executed plan can, in the end, bring better returns.

On the other hand, the modular nature of the components is perfect for empowering businesses with the flexibility to modify and deploy specific components as user requirements evolve.

So, our partnership was the perfect combination of a dream meant to help millions and the backing to provide a seamless application.

Want to gain in-depth insights into reducing your cloud expenses and improving the overall services you provide?

Connect with Arrk Group today!


A Tale of AI-Powered Data Processing Revolution

By Arrk Group

4 mins read

In construction, staying ahead of the game means getting detailed project data and reliable contacts. Our client is a trailblazing intelligence platform and one of the forerunners in providing top-notch construction marketing leads. The brand’s core revolves around delivering verified project data and GDPR-compliant contact lists.

But how do we achieve this?

The answer lies in their innovative approach to data automation and the partnership they forged with us. Armed with their construction expertise and our data management capability, the company mastered the art of building and curating quality construction leads.

So, what’s their secret?

Their secret lies in their website – a relentless commitment to sorting through countless data points. This data goes through an entire processing pipeline and is transformed into detailed insights.

This results in a website that caters to diverse customer requirements. So, to make this data processing smoother and faster for customers, the brand connected with us.

The Challenge at hand?

But here’s the problem – the mega volume of data pouring in from different sources was a huge challenge. Just imagine the costs and efforts required for offshoring this whole endeavour.

The main puzzle piece was collecting and putting the data into standardized structures. Especially considering that different websites provide data using different terminologies and classifications. Putting this data into standardized submissions was ‘Our Everest’.

Our Solution

We embarked on a quest to overcome this challenge with an initial attempt at rule-based data mapping. But this was a tedious and painfully slow process: tailor-made rules for every website gave us subpar results.

Enter Natural Language Processing (NLP) and Named Entity Recognition (NER). This dynamic duo had the potential to extract critical fields from the free-text data. But however promising the approach was, on its own it lacked the pinpoint accuracy our client sought.

So, a hybrid solution came into being!

We married rule-based logic with machine learning. At first, our team identified the crucial data fields and rules for validation. Then these fields were populated using some AI magic.

And the result we got?

An AI-powered solution that achieved what only human minds could.
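
A minimal sketch of the hybrid idea, assuming spaCy for the NER half; the field rules, regular expression, and entity labels shown here are illustrative, not the production ruleset.

```python
import re
import spacy

nlp = spacy.load("en_core_web_sm")  # assumes the small English model is installed

# Rule-based half: a deterministic pattern for one target field (illustrative).
VALUE_PATTERN = re.compile(r"£\s?([\d,]+(?:\.\d+)?)\s?(m|million)?", re.IGNORECASE)


def extract_fields(free_text: str) -> dict:
    """Combine regex rules with NER to populate standardised fields."""
    fields = {"project_value": None, "location": None, "organisation": None}

    # 1) Rules first: deterministic patterns such as monetary values.
    match = VALUE_PATTERN.search(free_text)
    if match:
        fields["project_value"] = match.group(0)

    # 2) Then NER fills in the softer fields from the free text.
    doc = nlp(free_text)
    for ent in doc.ents:
        if ent.label_ == "GPE" and not fields["location"]:
            fields["location"] = ent.text
        elif ent.label_ == "ORG" and not fields["organisation"]:
            fields["organisation"] = ent.text

    return fields


print(extract_fields("Acme Developments submitted a £2.5m residential scheme in Manchester."))
```

Rules handle the deterministic fields, and the NER model mops up the softer ones – the division of labour described above.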

What was our process?

We knew it would not be a one-time thing when we embarked on this project. Instead, it was a series of small victories that extended over a few months. Our plan was simple:

  1. List all the fields required by the client
  2. Plan out how to transform the data for individual fields
  3. Provide estimations of implementation

Considering the speed requirement, we rolled out releases every couple of weeks. And with every release, the client was overjoyed to see more accurate data added. We have populated all the sections for 25% of the project and hope to complete 50% by the end of the year.

The technology we used to complete this feat

Underpinning all our processes was a blend of cutting-edge technology. Our language of choice was Python, and the cloud platform was AWS. Two significant components shaped the architecture of this project: data processing and web crawling.

Web crawling combined Scrapy with other Python libraries, whereas data processing was powered by Python with an arsenal of libraries and machine learning.
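
For the crawling side, here is a minimal Scrapy spider sketch; the council URL and CSS selectors are placeholders, since every planning portal structures its pages differently.

```python
import scrapy


class PlanningApplicationsSpider(scrapy.Spider):
    """Minimal spider that walks a (hypothetical) council planning list page."""

    name = "planning_applications"
    start_urls = ["https://example-council.gov.uk/planning/applications"]

    def parse(self, response):
        # Selectors are placeholders -- real portals need per-site rules.
        for row in response.css("table.applications tr"):
            yield {
                "reference": row.css("td.reference::text").get(),
                "description": row.css("td.description::text").get(),
                "status": row.css("td.status::text").get(),
            }

        next_page = response.css("a.next::attr(href)").get()
        if next_page:
            yield response.follow(next_page, callback=self.parse)
```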

The future of AI

As the project continued, we could not help but wonder what lay ahead for our client. With our eyes on OpenAI, we realized that the technology holds the key to revolutionizing data processing. With its ability to generate fields, this technology could unlock new endeavours in the data automation world.

Conclusion

The tale of our client’s magnificent journey into automation is one of collaboration, innovation and perseverance. With the fusion of AI-based technology, we at Arrk were able to tame the evil data beast and turn it into a treasure trove of construction data. As we finished this project, we could not help but wonder at the possibilities of data automation reaching newer heights.

We at Arrk have a penchant for jumping head-on into problem-solving and crafting tailor-made solutions. Join hands with us to unlock your data’s potential in this new age.

The data revolution is here! Let Arrk be your guide in the next step to innovation!


Maximising Cost Optimization with AWS Best Practices

By Arrk Group

5 mins read

In today’s data-savvy world, businesses must rely heavily on cloud infrastructure. This dependence comes from the unmatched scalability and flexibility the cloud has to offer. The cloud’s ability to store, process, and analyse vast amounts of data enables companies to seamlessly adapt to changing demands. From swiftly making data-driven decisions to fostering innovation to providing a competitive advantage, all can be done by a cloud computing platform.

This has resulted in the evolution of cloud infrastructure into an indispensable asset for organisations seeking to thrive in the modern digital landscape.

Why?

To provide scalable and seamless services to their loyal customers, of course! One such cloud computing platform that has emerged on the top is Amazon Web Services (AWS). This prominent platform offers customers extensive services, including functionality, performance, and flexibility.

But at a high cost.

With organisations scrambling to outpace their competitors in the cloud game, they are pushing for expense and cost-optimisation checks, especially regarding AWS.

Our client, who is into construction intelligence, faced the same issue. They wanted to maintain the platform on AWS cloud but to optimise the amount spent in a structured manner. So, let’s look at how we helped our client reduce its overall expenses without compromising the quality of leads and data.

Let’s understand a bit about our client first before we discuss the problems and solutions.

Our client is a leading construction intelligence platform that has been providing customers with quality construction market leads and data for over 80 years. One of their primary functions was to deliver verified project details and GDPR-compliant contacts.

But, for them to achieve this, they must rely heavily on council websites.

This is because they collect, verify and refine data before sending it out to their customers via their portal. The management of data and projects is done in collaboration with our experts at Arrk, using multiple modules such as ML module, Platform, Planning Portal, Engines, and Harvest.

So, why did we look into Platform modules?

The Platform module serves as the backbone of the solution. It is developed using Python Flask for the backend API and ReactJS for the frontend, and the primary databases used for storage and retrieval are Postgres and ElasticSearch.

Multiple engines, such as the notification and export engines, also help the platform perform external tasks. The whole system is deployed on the AWS cloud, using services such as ElastiCache, EC2, Key Management, SQS, and more.
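
To illustrate the shape of the Platform module's backend, here is a minimal Flask API sketch. The routes, payload fields, and in-memory store are illustrative stand-ins for the real Postgres/ElasticSearch-backed endpoints.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Illustrative in-memory store -- the real platform persists to Postgres/ElasticSearch.
PROJECTS = {
    "PRJ-001": {"title": "Example mixed-use scheme", "status": "verified"},
}


@app.route("/api/projects/<project_id>", methods=["GET"])
def get_project(project_id):
    project = PROJECTS.get(project_id)
    if project is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(project)


@app.route("/api/projects", methods=["POST"])
def create_project():
    payload = request.get_json(force=True)
    project_id = payload["id"]
    PROJECTS[project_id] = {"title": payload["title"], "status": "unverified"}
    return jsonify({"id": project_id}), 201


if __name__ == "__main__":
    app.run(debug=True)
```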

But, with great tasks comes greater costs!

Considering the number of programs used, the extent of data to be sorted through, and the platform’s overall security requirements, it is no wonder the AWS bill was so high!

So, this was our task – using AWS but at optimised costs!

Challenge Accepted!

We at Arrk were challenged with optimising the client’s AWS cost structure while continuing to collect, refine and verify data on the platform. We also had to deliver strategies to reduce overall expenses without compromising the quality of market leads, while steadfastly delivering the product to the client base.

And solve it we did!

How did we solve it?

Our expert team identified areas in the client’s AWS cloud platform where significant costs could be saved. With a thorough analysis of infrastructure utilisation, usage patterns, and the total breakdown of costs incurred, we could present options for optimising the platform to the client.

Through our collaborative efforts, we leveraged tools such as AWS CloudWatch Logs and AWS Cost Explorer to gain valuable insights into the cost structure and break down the areas where spending could be minimised.
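
As an example of the kind of breakdown Cost Explorer provides, here is a hedged boto3 sketch that groups one month's unblended cost by service; the date range is a placeholder, and the real analysis covered more dimensions and periods.

```python
import boto3

ce = boto3.client("ce")  # Cost Explorer

# Placeholder date range -- in practice this was run over several billing periods.
response = ce.get_cost_and_usage(
    TimePeriod={"Start": "2023-08-01", "End": "2023-09-01"},
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
)

for result in response["ResultsByTime"]:
    for group in result["Groups"]:
        service = group["Keys"][0]
        amount = float(group["Metrics"]["UnblendedCost"]["Amount"])
        print(f"{service}: ${amount:,.2f}")
```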

What did we find?

As per our analysis, the key areas for cost savings include:

  1. Right-sizing – We showed the client how to reduce the legacy application’s instance sizes and counts for lower environments.
  2. Optimising storage – We reduced EFS storage size, changed volume types from GP2 to GP3, moved harvest documents from S3 to Glacier, and set the S3 retention period for short-lived objects to 1 day (a hedged lifecycle sketch follows this list).
  3. Auto-scaling – We implemented auto-scaling for the Web and API tiers and added shutdown scripts for lower environments.
  4. Containerisation and serverless – We containerised the engines with Docker and migrated the API and Web tiers to Fargate-based ECS clusters.
  5. Reserved capacity and Savings Plans – We introduced AWS Savings Plans to cover compute and ML services, and reserved instances for Harvest and RDS.
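
As a sketch of the storage optimisation in item 2, here is a boto3 example that applies an S3 lifecycle configuration transitioning harvest documents to Glacier and expiring short-lived objects after one day. The bucket name and prefixes are placeholders, not the client's real locations.

```python
import boto3

s3 = boto3.client("s3")

# Placeholder bucket/prefixes -- substitute the real harvest and temp locations.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-platform-documents",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-harvest-documents",
                "Filter": {"Prefix": "harvest/"},
                "Status": "Enabled",
                # Move harvested documents to Glacier for cheaper long-term storage.
                "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
            },
            {
                "ID": "expire-temporary-objects",
                "Filter": {"Prefix": "tmp/"},
                "Status": "Enabled",
                # Delete short-lived objects after one day.
                "Expiration": {"Days": 1},
            },
        ]
    },
)
```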

Knowing what we know, it was now time to formulate strategies to support our analysis.

How did we plan to save costs?

For our ‘AWS Cost Optimisation’ project, we implemented a documentation and project management system.

How did we do this?

By using Confluence and Jira!

We used Confluence as a central documentation hub and maintained a dedicated space where all relevant details, plans, and progress reports were saved, including details of cost-saving options and other technical information. This ensured that all team members had access to up-to-date information and could collaborate with one another easily.

Using Jira, we created a dedicated board to track and manage all the activities related to cost optimisation. The board followed a Kanban method, allowing users to visualise the entire process in one go or part by part. Each task had clear objectives and deadlines and could be assigned to the appropriate team members. This supported efficient task management and ensured the project moved smoothly.

We created a collaborative knowledge-sharing space and smooth workflow using these two platforms.

In summary, the client’s partnership with us was the ideal occasion to demonstrate how a thorough and rigorous strategy may deliver efficient solutions to optimise costs in the ever-changing world of cloud computing.

Our committed team collaborated closely with the client to gain crucial insights into the overall breakdown of expenses and pinpoint the areas where optimisation would pay off. By developing tactics specifically tailored to the client, we helped them keep delivering great market leads and information to their customers while keeping overall expenditure optimised.

So, are you looking to reduce cloud expenses and improve the services you provide?

Join forces with us right now!


Unveiling the Cloud Chronicles of a Polymer Giant

By Arrk Group

3 mins read

It all began around 35 years ago, when the client started serving a diversified range of markets with proven solutions backed by unparalleled expertise. Their products provide exceptional performance in the most demanding environments. But they faced challenges that weighed heavily on the company.

With Arrk, our client has become the world leader in PEEK and PAEK-based polymer solutions!

THE PROBLEM

Facing daily challenges in their operations and encountering setbacks in business growth and finances, the client was becoming increasingly anxious. Fortunately, Arrk was introduced. From the very beginning, Arrk started collaborating with both the technical and non-technical teams to understand the problem statement.

  1. Massive dependency on SharePoint On-Premises

The application relied on a SharePoint on-premises site, creating a dependency on the on-premises environment and hindering flexibility.

  2. Outdated technology

The application used classic ASP and .NET 2.0 technologies, whose limitations made maintenance and support difficult.

  3. Limited charting capabilities

The chart generation process used an outdated third-party library with limited features. The code was complex, and troubleshooting was challenging.

  4. Only Windows for data entry

Data entry was performed through a Windows Forms application, which made maintenance and updates difficult.

  5. Basic charts

The charts produced by the application had a basic visual appearance and lacked the modern, visually appealing designs users expect.

The project’s key objective was to modernize the old charting application to address the identified challenges and leverage cloud technologies for enhanced scalability, flexibility, and user experience.

THE SOLUTION

This is how Arrk got it all done. Having listened attentively, Arrk assembled a team of software consultants, both technical and non-technical, to document the tasks and modules and determine the priority of upgrades. Arrk identified the tools that required enhancement and pinpointed the software and licences needed for successful implementation, with the goal of achieving a solution.

  1. Reduced Dependency

Migrating the application from SharePoint on-premises to SharePoint Online eliminated the dependency on the on-premises environment and simplified maintenance.

  2. Enhanced Technology Stack

Upgrading the application to Power Apps and Power BI introduced modern frameworks and tools, improving performance, maintainability, and supportability.

  3. Improved Charting Capabilities

Re-developing the charting functionality in Power BI provided rich, visually appealing charts with accurate data representation, and eliminated reliance on the outdated third-party library.

  4. Streamlined Data Entry

The transition from a Windows Forms application to a web-based solution using Power Apps removed the need for application redistribution and made maintenance and updates easier.

  5. Enhanced Look and Feel

Power Apps and Power BI offer a rich look and feel, ensuring a modern and visually appealing user experience for the charts and the application.

TECHNOLOGY USED

In this whole process, the following services were used by Arrk:

  1. Microsoft Azure DevOps was used to track the project’s progress following the Agile process, which involved multiple sprints with show-and-tell sessions at the end of each sprint.

A technology stack was established for the revitalization of the legacy charting application:

  1. SharePoint Online – The application’s data was migrated from SharePoint on-premises to SharePoint Online, eliminating the dependency on on-premises environments.
  2. Power Apps – Used to develop the application’s front end, providing low-code development capabilities and easy maintenance.
  3. Power BI – Employed to re-develop the charting functionality, offering rich and visually appealing charts with accurate data representation.

It was a challenging project, but Arrk’s detail-oriented working style made it possible in no time. We serve technology to make businesses function better, one technology at a time!
