Security and the ongoing trust of our clients remain a high priority for us.

As part of this dedication to instilling the greatest level of trust with our clients, we recently took part in the UK Cyber Essentials Plus Scheme.   

What is the Cyber Essentials Scheme?  

The Cyber Essentials Scheme is a UK government-backed certification that helps organisations protect themselves against common cyber threats. The certification comes in two levels: Cyber Essentials and Cyber Essentials Plus, the latter adding a more comprehensive set of technical controls and a hands-on technical audit.

The aim of the certification is to give organisations of any size the ability to demonstrate they have the necessary technical controls in place to be secure against threats such as:

  • Malware  
  • Phishing  
  • Man-in-the-middle threats  
  • Denial-of-service attacks

Why is Cyber Essentials important?  

At Transparity we’re committed to being a Trusted Partner to all of our customers – from the services we deliver, to the systems underpinning Transparity, to the team of specialists that makes it all possible. Cyber threats may target an individual organisation, but the impact can often be widespread.

For that reason, we decided to undergo the process of becoming Cyber Essentials Plus certified. This demonstrates to all of our partners that we take cyber security seriously, giving them peace of mind that we have the tools and measures in place to stay cyber secure.

What does this mean for Transparity and our customers?  

It’s easy to take cyber security for granted and to fall into the trap of assuming common sense will prevail. With every aspect of working life becoming digital, and cyber attacks growing in sophistication, the likelihood of falling victim increases exponentially. Being able to demonstrate to both employees and customers that cyber security is a subject we take seriously is invaluable, and helps reinforce the fact that we are a trusted partner in all aspects.

Being awarded Cyber Essentials Plus is not the end of the journey for us; cyber security, and the threat it poses to Transparity and our partners, remains a continual focus.

Redis is a well-known caching technology, and you can run it in Azure as a fully managed service (Azure Cache for Redis). Being a fully managed service, you do not need to worry about provisioning the underlying infrastructure. If you have experience with on-premises cache servers, you will know how difficult it is to build a three-node load-balanced cache, or how much effort goes into patching it. As a managed service, it leaves you free to concentrate on building solutions that require caching.

There are many patterns out there where a cache is important, including cache-aside, write-behind, write-through, read-through and refresh-ahead. Whichever pattern you decide on, Azure is a great place to build out the services.
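As a quick illustration of the cache-aside pattern, here is a minimal Python sketch. The dict-backed store and the `fetch_from_db` function are hypothetical stand-ins; against Azure you would swap them for your Redis client and a real database query.

```python
import time

class CacheAside:
    """Minimal cache-aside sketch: check the cache first, fall back to the
    backing store on a miss, then populate the cache with a TTL."""

    def __init__(self, load_fn, ttl_seconds=300):
        self._load = load_fn    # e.g. a database query
        self._ttl = ttl_seconds
        self._store = {}        # stand-in for Redis: key -> (value, expires_at)

    def get(self, key):
        entry = self._store.get(key)
        if entry is not None:
            value, expires_at = entry
            if time.monotonic() < expires_at:
                return value      # cache hit
            del self._store[key]  # entry expired, treat as a miss
        value = self._load(key)   # cache miss: hit the backing store
        self._store[key] = (value, time.monotonic() + self._ttl)
        return value

# Hypothetical backing store standing in for a database
calls = []
def fetch_from_db(key):
    calls.append(key)
    return f"value-for-{key}"

cache = CacheAside(fetch_from_db, ttl_seconds=60)
cache.get("user:42")   # miss: loads from the "database"
cache.get("user:42")   # hit: served from the cache, no second DB call
```

The point of the pattern is that the backing store is only queried once per key until the cached entry expires.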

AZURE REDIS CACHE TIERS

There are several tiers to select from; Azure gives you a great choice.

Azure Redis Cache Tiers

If you need geo-replication capability, then the Basic and Standard tiers are not high enough for you. Naturally, the cost increases as you go up the tiers.

Many enterprises will opt for the Premium tier onwards, where the combination of horizontal scaling, private link, data persistence and geo-replication are key requirements for projects. The underlying architecture is complex, fully built by Microsoft, and for the Premium tiers onwards looks like the below.

Azure Redis Cache Architecture

When using the Premium tier, you can choose between one and three availability zones; many will opt for three zones, which makes sense for maximum resiliency.

BUILDING A REDIS CACHE

Within the Azure Portal, find Azure Cache for Redis.

Azure Cache for Redis

Fill in the basics (the location should be where your app is):

Fill in the basics

Click on the pricing details hyperlink. This is where you can select the tier you want, which we discussed in the earlier part of the blog.

Select the tiers

You can see quite a difference in pricing between the 53GB cache and the 120GB cache. An important note: the Basic tier should never be used for production.

From a networking perspective, you will likely opt for a private link. The ability to connect with a private IP for backend applications is important.

Networking

The next section is about the build.

Redis version

System

Let’s discuss this section. If you’re using the Premium tier, you have the ability to increase the shard count. The image above shows a shard count of 2, meaning 12GB of cache available; this will increase costs. The main benefit is having more gigabytes, but sharding also gives you the ability to continue operations when a node is experiencing failures or is unable to communicate with the rest of the cluster, as well as allowing for more throughput. You can have up to 10 shards in the cluster.

Also, notice that for more availability I decided to use 3 availability zones; you can also select 1 or 2. If you opt for a persistent storage strategy, then you need an Azure Storage account, and ideally you will want to access it not with keys but via a managed identity. Managed identities are an extremely powerful concept: we no longer need to manage credentials between connecting cloud services, and there are no extra costs.

DATA PERSISTENCE

Why would you want to persist data? If a complete hardware failure occurs and all caches are offline, the persistent storage is used to reconstruct the cache from the most recent snapshot; this is called RDB persistence. There is also AOF persistence, where all writes are saved to a log and the cache is rebuilt by replaying it. Check out this document for an in-depth analysis of the pros and cons: https://redis.io/topics/persistence#aof-advantages.

The data persistence screen is shown below.

Data Persistence

Some important points regarding data persistence:

  1. You can’t enable both types, only one at a given time.
  2. Don’t forget the logged data within the storage account has associated costs assigned to it.
  3. If you are dealing with high throughput data then you will probably want your storage account to be premium tier.
  4. RDB files backed up to storage are stored in the form of page blobs.

Now we are ready to deploy and create the cache.

Deployment in progress

You can build many different applications against your Azure Redis Cache service, including .NET, ASP.NET, Java, Node.js, Python and even Rust apps. The prerequisite for building these apps is that you need the hostname, port number and access keys, which you can find within the Azure portal under the Settings section.

Access Keys

Please see the following example from Microsoft documentation: https://learn.microsoft.com/en-us/azure/azure-cache-for-redis/cache-web-app-howto.
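As a minimal sketch, the values from the portal can be assembled into a `rediss://` connection URL that most Redis clients accept (Azure Cache for Redis uses TLS on port 6380 by default). The hostname and key below are placeholders, not real values:

```python
def build_redis_url(hostname: str, access_key: str, port: int = 6380) -> str:
    """Build a TLS Redis connection URL from the hostname, port and access
    key shown in the Azure portal. The rediss:// scheme (double 's')
    tells the client to connect over TLS."""
    return f"rediss://:{access_key}@{hostname}:{port}"

# Placeholder values; substitute your own cache name and access key
url = build_redis_url("mycache.redis.cache.windows.net", "<access-key>")
```

With a client library such as redis-py, you could then connect with something like `redis.from_url(url)`; check your chosen client’s documentation for its exact TLS options.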

You can confirm that Redis is online and running via the Redis console in Azure.

Click it and issue the INFO command.

INFO command

TIPS FOR USING AZURE REDIS CACHE

Here are some general tips when using Azure Redis Cache:

  1. The location of the Redis Cache service should be close to your application.
  2. Data structures within Redis mean that larger key-value sizes lead to fragmentation of memory space, and these larger memory requirements mean more network data transfer. Redis recommends a maximum size of 100KB.
  3. Creating connections is expensive (it can cause latency), so reuse connections where possible, and close connections you no longer need.
    • You should be thinking about expiration times and expiration sliding strategies.
  4. Redis executes commands on a single thread, so use expensive commands with caution; for example, the KEYS command may ruin performance.
  5. You need to cater for possible failover issues. An unplanned failover might happen because of hardware failure, network failure, or other unexpected outages to the primary node. Developers should use the runtime maintenance notifications on a publish/subscribe (pub/sub) channel called AzureRedisEvents. Many popular Redis client libraries support subscribing to pub/sub channels, and receiving notifications from the AzureRedisEvents channel is usually a simple addition to your client application.
  6. Consider tweaking memory policies to improve system responsiveness. There are 3 key settings: maxmemory-policy, maxmemory-reserved and maxfragmentationmemory-reserved. Microsoft covers these settings in depth here: https://learn.microsoft.com/en-us/azure/azure-cache-for-redis/cache-configure#memory-policies.
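Tips 1–3 can be sketched in a few lines of Python: create the client once and reuse it, and give every key an expiration. The `FakeClient` below is a dict-backed stand-in so the sketch runs without a live cache; with a real client the `setex`/`get` calls would go to Redis.

```python
import time
from functools import lru_cache

class FakeClient:
    """Dict-backed stand-in for a Redis client so this sketch runs offline."""
    def __init__(self):
        self._data = {}

    def setex(self, key, ttl_seconds, value):
        # SETEX stores a value together with an expiration, so stale
        # entries age out of the cache automatically
        self._data[key] = (value, time.monotonic() + ttl_seconds)

    def get(self, key):
        entry = self._data.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self._data[key]
            return None
        return value

@lru_cache(maxsize=1)
def get_client():
    # Create the client once and hand back the same instance on every
    # call: creating connections is expensive, so reuse them
    return FakeClient()

get_client().setex("session:abc", 300, "serialised-session-state")
```

Because `get_client` is cached, every caller shares one connection object instead of opening a new one per request.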

CONCLUSION

As you can see from this blog post, Azure Cache for Redis is a very mature cache technology which can be built with ease via the Azure Portal. Well-built integrations with components such as Azure Storage, private link and managed identity, plus support for various languages, mean building a cloud-native solution at a cost-effective price point is very much possible.

If you’re interested in Transparity helping you implement Azure Redis Cache, just get in touch.

In 2022, data analytics continues to deliver essential insights to global industries and enterprises.

With the rise of Big Data analytics, and recent developments in predictive analytics continuing to deliver enhanced intelligence, analytics has never been more present, more accessible, or more advanced. 

Taking a closer look at the analytics industry in 2022 and beyond, we’re exploring some of the most revealing, current data analytics statistics and trends. Read more below. 

#1: Data-driven companies are 58% more likely to beat revenue goals 

A recent Forrester article emphasises the importance of data-driven insights on growth. With access to greater intelligence, companies can gain insights and realise more opportunities for growth. They can identify the core trends to capitalise on – and the serious risks to avoid. 

Learn more about the importance of data-driven insights for true business growth. 

#2: 62% of retailers report gaining a competitive advantage from information and data analytics 

IBM discovered that data analytics is actively helping retail companies get an edge in increasingly competitive and saturated markets. Enabling retailers to create and deploy strategies based on facts and insights rather than intuition, analytics helps them navigate slim margins of error. 

Learn more about how data analytics can empower retailers in our free ebook. 

#3: Predictive analytics are on the rise 

With more awareness of and access to capabilities, organisations can take advantage of predictive analytics to forecast potential future trends. Empowering businesses with strategic intelligence for decision-making, the predictive analytics market is expected to reach $22.1 billion by the end of 2026 according to a recent report. 

#4: Data-driven culture matters 

We fully understand the importance of a data-driven culture, knowing from experience that it helps to reduce the risk of data silos and unreliable data. Nevertheless, in a recent study, over 60% of organisations stated that company culture is their greatest barrier. 

Want to learn more? Visit our blog to discover the importance of a data-driven culture. 

#5: Efficiency is the core benefit of analytics 

Finances Online reported that out of all companies interviewed, 63% believe that improved efficiency is the number one benefit of data analytics. In the same survey, 53% of companies also believed that more effective decision making was a core priority. 

#6: Unstructured data is still a critical challenge 

In an economy fuelled by data, only those with compliant and optimised data will be able to successfully navigate the market, as well as adjust their business strategies. 

Unstructured data is more complex and costly to both manage and integrate into analysis processes. According to Forbes, 95% of businesses cite the need to manage unstructured data as a problem for their business, with those able to do so gaining a significant competitive advantage with clearer visibility. 

#7: The global Big Data and data analytics market is booming 

In 2022, the worldwide Big Data and data analytics market was valued at just over 274.3 billion USD. 

As more and more enterprises realise the importance of analytics, this market rise isn’t expected to decline any time soon. With a continuing focus on the importance of Big Data analytics to help drive business strategies, Big Data analytics is here to stay. 

#8: Budget is blocking Big Data 

Topping off a list of barriers that include integration challenges and a lack of expertise, budget remains a key blocker to entry to Big Data analytics. But just how true is this? 

We recently released an eBook aimed at tackling this issue directly – exploring three analytics projects at three varying investment levels. Download it to learn more. 

#9: Businesses now believe that an inability to take advantage of big data will cause bankruptcy 

Reinforcing the importance of Big Data analytics, Accenture’s study on the success rates of Big Data analytics found that more than three quarters of enterprises believed that failing to take advantage of it could severely impact their business. 

#10: Companies embracing data analytics could increase operating margins by 60% 

McKinsey’s study on Big Data found that businesses which prioritised data analytics could increase their operating margins by up to 60% – showcasing the importance of being able to harness data analytics. 

Committed to delivering data-driven growth 

As a specialised BI consultancy, we’re committed to empowering enterprises to gain actionable insights from their data – implementing value-led projects to enable strategic intelligence. 

To learn more, discover how we’ve empowered clients in our case studies hub. 

This year, Microsoft’s key focus was on artificial intelligence (AI), and it has embedded AI-assisted features, termed Copilot, across its product portfolio. What is clear is that AI has landed: it’s production-ready and is ushering in a new era of productivity for businesses. 

As businesses continue to grow and evolve, the role of data engineering services becomes increasingly important. 

Helping to create reliable data pipelines, architect data sources, and implement enhanced automation workflows wherever possible, data engineers can help optimise the functionality and reliability of data throughout an enterprise. 

In doing so, they can create consistent outputs and encourage a data-driven culture enterprise-wide by communicating the value of consistent results. 

From small regional businesses seeking secure and consistent outcomes to large multi-tiered enterprises in search of increased functionality and efficiency, data engineering services can bring significant benefits. Read on to learn more. 

Defining data engineering 

At its core, data engineering is the design and creation of infrastructure that transforms and transports data into a highly usable format for end users – be they key stakeholders or internal sales teams. 

The role of a data engineer, however, does not end there. Enabling the flow of traffic between systems, data engineers make data functional, consistent, and of higher quality in the process. They work collaboratively with enterprises to understand both the long and short-term challenges to the business, and also the underlying architecture. 

With this knowledge of business objectives and current infrastructure, data engineers can build intelligent and advanced solutions that not only cater to current challenges but anticipate future risks that accompany ongoing growth and expansion. Common examples of data engineering services include: 

  • Extracting data from different data sources into a data lake or data warehouse 
  • Enabling the ability to consume data in real time 
  • Integrating multiple systems together such as order management and finance systems 
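As a toy illustration of the third bullet, integrating an order management system with a finance system often boils down to joining extracts on a shared key. All of the names and records below are hypothetical:

```python
# Hypothetical extracts from two source systems
orders = [
    {"order_id": 1, "customer": "Acme", "amount": 120.0},
    {"order_id": 2, "customer": "Globex", "amount": 75.5},
]
payments = [
    {"order_id": 1, "paid": True},
    {"order_id": 2, "paid": False},
]

def integrate(orders, payments):
    """Join the two extracts on order_id to produce one consolidated
    record per order - the kind of transform a pipeline runs on a schedule."""
    paid_by_id = {p["order_id"]: p["paid"] for p in payments}
    return [{**o, "paid": paid_by_id.get(o["order_id"], False)} for o in orders]

consolidated = integrate(orders, payments)
```

In a production pipeline the same join would run against a data lake or warehouse rather than in-memory lists, but the shape of the work is the same.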

Automating workflows 

Automation is the perfect answer for enterprises in search of efficient and streamlined data flow. For internal teams, automated workflows can enable the seamless transfer of data into a wide array of systems, bringing several significant advantages with it in the process. 

Reducing risk 

Traditionally, the transfer of datasets between systems, or their initial input, may require lengthy and error-prone manual processes. However, with manual processes being replaced with automated workflows, teams can reduce the risk that accompanies legacy processes. 

As a result, users across all levels of an enterprise can place more trust in data-driven insights, with enhanced consistency and accuracy. 

Greater capacity and productivity 

Another significant benefit of data engineering-enabled automation is in the capacity and resource it returns to internal teams. While users traditionally may have had to devote large portions of their time to inputting and transferring data, this is no longer necessary. For the core user, this gives back valuable time to focus on other important tasks, boosting overall efficiency for teams and the business as a whole. 

Building sustainable architecture 

As the scale and complexity of a business grows, so too does the number of potential risks and threats. 

A specialist data engineer can design and implement solutions for a business’s architecture that can resolve not just the initial identified challenges, but support ongoing growth and scaling. 

With additional data sources and formats naturally accompanying architecture growth, new unexpected challenges may arise, affecting processing time and security. To mitigate this risk, it’s important to design intelligent solutions that can handle ongoing pressure, challenges, and demands. 

Whenever we work with new clients, the first step that we embark on is to understand the individual complexities, risks, and challenges of underlying architecture for this very reason. 

By successfully negotiating any possible risk and creating reliable foundations at the beginning of our involvement, we can integrate intelligent engineering services that won’t fall victim to possible disruption. For internal teams and stakeholders, this provides cost-effective and scalable solutions, with long-term confidence in the infrastructure secured. 

Helping teams with capacity 

Internal teams may struggle with capacity for a number of reasons. Internal IT teams may possess the skills and capabilities to construct intelligent data engineering solutions, yet they may not have the necessary capacity to do so. 

Through our services, we work with these internal teams to help alleviate pressure, designing and implementing these solutions while prioritising trust and communication throughout the entire process. This means that IT teams will be familiar with our solutions, and able to maintain and even further optimise them, long after our partnership has come to an end. 

For internal teams that may not possess the knowledge necessary to maintain and upgrade systems in the long term, we also offer a fully managed, retained approach to data engineering services. 

This approach enables us to provide ongoing and long-term support for your enterprise’s architecture. Meanwhile we will maintain and suggest further improvements to your infrastructure without decreasing the capacity of internal teams and users. 

Committed to delivering trusted intelligence 

Throughout our involvement across a wide range of solutions and services, we use a value-led methodology that provides measurable benefits for enterprises across a range of industries. 

From implementing a data warehouse to facilitate centralised reporting, to building pipelines between previously isolated systems, our range of services are designed to elevate intelligence, and establish confidence in your data throughout all processes – enabling a more advanced approach to strategic decision-making. 

To learn more about how our solutions can help reach your business objectives and support internal teams, contact us today.

Our people are what makes Transparity special. So, we’d like to introduce you to some of our team so you can get to know the people who make it all happen and get their take on Winning from Anywhere. Introducing Nicole Warren, Senior Digital Marketing Executive.

 

Background and a bit about you

In 2012 I started a degree in Spanish Studies in hopes of becoming a translator at the UN. Being half Spanish and able to speak the language, this seemed quite an easy route for me into professional life.

However, after spending a year abroad in Spain teaching and translating, due to some unfortunate circumstances, I quickly realised that this was not for me. At the time, I had my own fashion blog and spent a lot of my spare time keeping it updated and building up a following. Because of this, I decided I would jump into a digital marketing masters – how I got onto it is still a miracle to me!

After graduating, I landed myself a job with AMT – an industry very foreign to the fashion world, but the skills were very much transferable. 

How did you find joining Team Transparity after AMT was acquired?

In all honesty, I’m not a huge fan of change and after being told that we had been acquired, I was scared – as expected. I didn’t know what was going to happen, who I was going to be working with, how I would be working (in the office or remote) etc. From working in a very small, close-knit team to working for a company with over 150 people, the jump was big.

Little did I know that the team I was joining was perfect for me. Everyone was incredibly supportive and keen to get to know me despite not working together in an office and being across the country from each other. The marketing team was amazing (only two at the time) and they helped me more than I could have hoped for when getting to grips with a marketing engine much bigger than I was used to!

How has Winning From Anywhere® impacted your work and home life?

Winning from Anywhere® has made my work-life balance so much better! I have family in Spain that I had not seen for 3 years due to the pandemic; now, thanks to being able to work from anywhere, I am able to spend plenty of time in Spain. So right now, I am writing this blog, but in approximately two hours I will be going to my aunt and uncle’s house for a BBQ and a dip in the pool.

It’s not only the fact that I can work from anywhere in the world, but also the fact that I can work from home and go into the office whenever I need to. That means I can cut out commute times and do more of what I love outside of work, making me happier when I’m in work.

How have you found embracing the Winning From Anywhere® way of working?

In all honesty, when I started working remotely, I was SUPER excited – the fact I could cut out my horrible commute to the office and have that extra time doing other activities (or simply staying in bed that tiny bit longer) was great. However, over time I came to realise that in order to ‘Win From Anywhere®’ you have to be disciplined and organised. Distractions are very real!

This was a particular problem for me in Spain due to the fact that I had my (very loud) family around me, all the time. To combat this, I made sure I worked when everyone was at the beach, or just out of the house. Sometimes this meant working weird hours, but having the discipline and the trust from my team that I was doing the work meant that I was able to work flexibly with less distraction.

How have you found building relationships with colleagues while Winning From Anywhere®?

A challenge with Winning From Anywhere® is the fact that we can’t build the constant face-to-face relationship with our colleagues that we would in the office. I’ve found that I have had to be more intentional in getting to know people I work with – regular meetings talking about topics other than work really help! It’s thanks to Teams messages and video calls that we can still build that connection with others despite the distance.

What would you tell future Transparity team members about Winning From Anywhere®?

Top tips from me on how to make the most of Winning From Anywhere®:

  • Make a clear list of tasks you want to be completed by the end of day as a non-negotiable.
  • Try to have a separate work area/room if you can with minimal distractions.
  • TAKE BREAKS (I am still working on this one!)
  • Don’t just talk to your colleagues about work – you have a life too and so do they!

What are your career highlights so far?

To be honest, I have quite a few, but one of my career highlights so far has to be building AMT’s marketing up from next to nothing.

When I joined, there wasn’t really much going on so I was able to make the role mine and run with my ideas to where it got to before we were acquired by Transparity. I like to think that the AMT website is what attracted Transparity to us!

What’s your favourite thing about working at Transparity?

My favourite thing about working at Transparity is the enablement they provide to get me to my career goals. I love PPC (Pay Per Click) and SEO (Search Engine Optimisation), and although I came from being a marketing ‘generalist’ at AMT, I have found that this is an area I want to excel in, and I have been given the support to push myself in it – from the courses I will be enrolling on, to working flexibly around my studies.

Transparity care about what I want my future to look like.

What do you like to do when you’re not working?

When I’m not working, I love to socialise with my family and friends, whether that’s a BBQ, going out for drinks or going to the beach. I love paddleboarding (if you live on the south coast and don’t take advantage of the sea, you’re missing out!) and try to get out as much as I can.

I also have a small beauty business and transform people’s eyebrows, but I see this more as a hobby because I enjoy it so much!

Are Enterprises taking advantage of the full benefits of Microsoft Azure? 

Boasting a comprehensive toolbox for a wide range of services – from AI to security and governance – Microsoft Azure can help enhance the capacity, capability, and confidence of internal teams throughout an enterprise. 

But with so many Azure services to choose from, it can often be difficult to deploy the very best tools for your architecture. The Azure platform is a comprehensive set of tools, from simple cloud storage to advanced AI & ML capabilities. Choosing the right service for your data and analytics architecture is vital, as incorrect setup or choice can negatively affect the quality of your end results and objectives. 

Acting as an outsourced Azure data integration partner, we’re able to offer complete solutions that start at initial audit and assessment, and end at ongoing maintenance and evaluation. 

These integration services move beyond the implementation of single components such as Data Factory and SQL Databases, and towards specialist bespoke solutions. As a result, we can help enterprises and teams take advantage of the full benefits of the Azure ecosystem. 

Microsoft Azure: The advantages 

There are a wide range of benefits that come with a successful and specialised Azure integration – with each one capable of helping empower teams for the better. Two core advantages include flexibility and the ability to automate processes. 

Flexible approaches 

Azure facilitates the ability to perform the same function in several different ways according to your specific needs and compute requirements. When you need additional resources, they can be created quickly and cost-effectively, with users able to control resource as easily as selecting the next option on a slider. 

This means that enterprises only pay for compute resources when needed, granting greater flexibility compared to traditional on-premises solutions that require consistent maintenance and scaling. 

However, this flexibility can also have an impact on budget, as the same function completed for an investment of £5,000 may be possible for £500. Understanding the best pathway to the greatest ROI can be difficult; however, our Azure services can guide you through this process to reach the best value for your enterprise. 

Automated practices 

With a fully functional and value-led Azure integration, teams can take advantage of the potential that Azure-based automation has to offer, as data is transferred from a dizzying array of sources via Azure Data Factory to platforms such as Azure SQL Database, Azure Synapse, and Databricks. 

Substituting time-consuming and manual processes with rapid automation gives valuable time back to staff to focus on other projects or priorities. 

Additionally, it may even help to reinforce the quality of the data and insights generated, mitigating the potential for errors to affect reliability. 

What’s the role of an Azure data integration engineer? 

An Azure data integration engineer holds essential skills to accelerate and enhance the movement, transformation, storage, and use, of Azure-based data. 

The engineer’s main role is to choose the correct Azure components to implement and deliver a proposed end-to-end solution. The specific components chosen can be influenced by a wide range of factors, such as: 

  • The volume and complexity of data 
  • Data latency 
  • The project budget and scope 
  • The specific use case and objective 

While the above describes the role of an Azure data integration engineer, the reality is much more complex. The data integration engineer should help a customer or a business to build a technical solution that solves a specific business problem, using the services available in the Azure product range. 

With hundreds of use cases to decipher, the input from a data integration engineer cannot be overstated. 

Consistency, clarity, and confidence 

We’re progressively seeing our Azure integration services becoming the new starting point, and standard, for wiring up data between database platforms and analytics services like Azure Synapse Analytics. In addition to these projects, we’ll often use Databricks or Azure Data Factory for managing data transformation. 

But what does this mean for businesses in reality? 

These integrations create harmony between different departments – such as logistics, finance, and marketing – and the different platforms they use. With the correct integration and layout, harmful silos are removed and replaced by a collaborative, communicative, and clear overview. 

Our integration services can also be performed in tandem with internal IT teams. Struggling with capacity in demanding environments, these teams may possess the skill sets needed to implement, maintain, and support Azure implementations, but lack the resources to do so. 

As trusted Azure partners, we can provide a collaborative and comprehensive integration solution – engaging with IT teams or internal stakeholders while implementing and coordinating successful delivery, from start to finish. 

To begin your journey to an enhanced Azure environment, get in touch today. 

The financial services industry has long been a target for cybercriminals hoping to access the sensitive and valuable information they hold. The majority (56%) of data breaches come from external threat actors with 96% of all breaches financially motivated, according to Verizon’s 2021 Data Breach Investigations Report.

Recently, ransomware has emerged as a growing threat to the financial services industry, with Microsoft reporting “a massive growth trajectory for ransomware and extortion”. They also report that the financial and insurance sector sits in the top 3 most targeted industries for ransomware attacks.

Ransomware attacks can be devastating both financially and to an organisation’s reputation. Ransomware is a malicious type of software that blocks an organisation or user’s access to their data by encrypting it until a ransom is paid. These attacks target businesses of all sizes and can be extremely lucrative for cybercriminals. In May 2021 it was reported that CNA Financial, one of the USA’s largest insurance companies, paid an astonishing $40 million to recover their data after a ransomware attack.

How to defend against ransomware in the financial services industry

Adopt the Zero Trust philosophy

Cybersecurity is essential for businesses of all sizes to mitigate the risk of a cyberattack. The best defence against ransomware is a robust security setup based on the three pillars of cybersecurity – Zero Trust, least privilege and assume breach.

  • Zero Trust – to never trust anyone and always ask them to verify their identity
  • Least privilege – once verified, to only provide them access to the things they absolutely need, and only for the minimum amount of time required
  • Assume breach – to always assume that any protection will fail, through either user error or system fault

These pillars make up the Zero Trust philosophy which is fundamental to a solid defence. In fact, 96% of security professionals see it as critical to their organisation’s success.

Use Microsoft’s security toolset

Microsoft’s extensive security suite has all the tools you need to maintain the security of your organisation. From Microsoft Sentinel for monitoring your environment to Microsoft Defender for Endpoint to secure users’ devices, Microsoft has the tools you need to keep your data secure end-to-end.

Reduce your risk

Microsoft recommends limiting the scope of the damage and working to remove security risks as the top priorities for anyone looking to reduce their risk of a ransomware attack. Limit the scope of the damage and make it harder for attackers to access multiple essential systems by establishing the Zero Trust method in your security setup. Then, work to remove the security risks that may leave you vulnerable, starting with implementing Multifactor Authentication (MFA) to keep user accounts secure.

Keep your systems and data secure

Cybersecurity is a continuously evolving challenge, as cybercriminals develop new, more sophisticated ways to gain access to valuable data. To stay ahead of the threat, organisations need security experts at their side maintaining and perfecting their defences.

Our Managed Security Service is built on the core principles of Zero Trust, informed by the latest threat intelligence to stay ahead of emerging risks. Our experts work proactively to close vulnerabilities and continuously improve your security posture with 24×7 support, so you can be confident in your security.

Security guidance tailored to your requirements

Take advantage of Microsoft funded workshops for in-depth guidance from our security experts. Explore Microsoft’s extensive security toolset, analyse current threats and create a strategic security plan to protect and govern your organisation’s data. Get actionable next steps to improve your security posture and put your questions to our experts so you walk away with the insights you need.

Ransomware protection for customers in financial services

We partnered with a large asset management company who were looking to improve their overall security posture and ensure all sensitive company data was kept safe. They were looking for a security partner they could trust to proactively protect their systems and data from cyberattacks, 24/7.

The team opted for our Managed Security Service for complete peace of mind. Their environment was aligned to our Secure By Design blueprint, which includes detection, prevention and mitigation of security threats while proactively searching for potential vulnerabilities and resolving them. It’s a culture of comprehensive protection and continuous improvement.

In 2021 alone, our teams managed more than 2,000 security-related events including six critical vulnerabilities impacting businesses worldwide – successfully protecting the business from potential breaches.

“Having this reliability was one of the main deciding factors in adopting the MSS, as we knew we could rely on Transparity to provide us with the best possible service… [I] would highly recommend Transparity Managed Security Service to any companies looking to enhance their security, and feel safe knowing Transparity MSS is protecting the infrastructure 24 hours a day.”

IT Manager at a large wealth management company

The next step in protection from ransomware for businesses in financial services

The threat of ransomware in financial services is growing and evolving, as cybercriminals develop their methods and become more effective in targeting financial data. While the risk is significant, it can be dramatically reduced through robust security measures and consistent security hygiene.

MSI Reproductive Choices Choose Transparity

We’re thrilled to be working with MSI Reproductive Choices to create a Global Data Warehouse (GDW). MSI is a not-for-profit organisation delivering essential sexual and reproductive health services across 37 countries worldwide and has delivered services to over 155 million women since 1976.

The development of this Global Data Warehouse will upgrade MSI’s reporting and analytics capabilities, reducing manual processes and saving them time. By creating a single centrally managed repository the MSI team can create on-demand reports quickly and easily. We began with a Proof of Concept (POC) before moving on to the roadmap for implementing the GDW first in the UK and then globally. Our dedicated not-for-profit focused account management team have supported MSI through the process and will continue to provide support until the GDW is fully rolled out later this year.

We’re pleased to be helping MSI streamline its reporting processes to support innovation and adaptation across the global organisation.

Apache Log4j 2 vulnerability (CVE-2021-44228) – Updated

15/12/2021 – Update

As our threat researchers continue to monitor the Log4j vulnerabilities, new information and guidance will be surfaced, including a new CVE, CVE-2021-45046. This CVE deals with the remaining security vulnerabilities that were not fully patched in Log4j 2.15.0, the release that addressed CVE-2021-44228. CVE-2021-45046 has a CVSS base score of 3.7, which is less serious than the score of 10 for the original vulnerability CVE-2021-44228; however, there is still a risk of denial of service from this latest vulnerability.

The affected Apache Log4j software is used by many software vendors in their products, and by extension many applications may require updating in order to patch the Log4j vulnerability. The security community is still working to identify all affected products. The Cybersecurity & Infrastructure Security Agency (CISA) are collating a list of known affected vendors and software, which can be found here: GitHub – cisagov/log4j-affected-db. This list is constantly being updated.

The latest mitigation advice from the Transparity SOC is based on the most up-to-date information from Apache, which can be found here: Log4j – Apache Log4j Security Vulnerabilities.


Versions Affected: all versions from 2.0-beta9 through 2.12.1 and 2.13.0 through 2.15.0
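As a purely illustrative aid (not an official Apache tool), the affected ranges above can be expressed as a small version check, assuming a simple major.minor.patch comparison where beta releases are treated as their base version:

```python
# Illustrative helper: does a Log4j 2 version string fall in the ranges
# affected by CVE-2021-44228 / CVE-2021-45046?
# Affected: 2.0-beta9 through 2.12.1, and 2.13.0 through 2.15.0
# (2.12.2 is the patched Java 7 backport, 2.16.0 is the fixed release).

def parse_version(v: str) -> tuple:
    # "2.0-beta9" -> (2, 0, 0): treat pre-releases as their base version
    core = v.split("-")[0]
    parts = [int(p) for p in core.split(".")]
    while len(parts) < 3:
        parts.append(0)
    return tuple(parts)

def is_affected(version: str) -> bool:
    ver = parse_version(version)
    # 2.0-beta9 .. 2.12.1
    if (2, 0, 0) <= ver <= (2, 12, 1):
        return True
    # 2.13.0 .. 2.15.0
    if (2, 13, 0) <= ver <= (2, 15, 0):
        return True
    return False
```

Note this deliberately errs on the side of caution: very early 2.0 betas (before beta9) also map to (2, 0, 0) and so report as affected.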


Mitigation for CVE-2021-45046 and CVE-2021-44228

Transparity’s advice is to lead with patching:

  1. Upgrade Java to version 8.
  2. Upgrade Log4j to release 2.16.0 or higher (2.16.0 is the latest version at the time of writing).

If you are currently running Java 7, a patch, 2.12.2, is being worked on and should be available soon. However, the Transparity SOC would recommend upgrading to Java 8 where possible.
(Log4j – Apache Log4j Security Vulnerabilities)

As a last resort, where patching is not possible, Apache advise removing the JndiLookup class from the classpath:
zip -q -d log4j-core-*.jar org/apache/logging/log4j/core/lookup/JndiLookup.class
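Where the zip utility is not available, the same class removal can be sketched with Python's standard zipfile module. This is an illustrative sketch, not Apache tooling – stop the application and back up the jar before modifying it:

```python
# Sketch: rewrite a jar, omitting JndiLookup.class (the same effect as the
# `zip -q -d` command above). Back up the jar and stop the app first.
import os
import shutil
import zipfile

TARGET = "org/apache/logging/log4j/core/lookup/JndiLookup.class"

def strip_jndilookup(jar_path: str) -> bool:
    """Return True if the vulnerable class was found and removed."""
    tmp_path = jar_path + ".tmp"
    removed = False
    with zipfile.ZipFile(jar_path, "r") as src, \
         zipfile.ZipFile(tmp_path, "w", zipfile.ZIP_DEFLATED) as dst:
        for item in src.infolist():
            if item.filename == TARGET:
                removed = True
                continue  # skip the vulnerable class
            dst.writestr(item, src.read(item.filename))
    if removed:
        shutil.move(tmp_path, jar_path)  # replace the jar in place
    else:
        os.remove(tmp_path)  # nothing removed; discard the copy
    return removed
```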


Additional mitigations:

  • To prevent attacks at the network level, outbound connections from affected servers can be limited to trusted hosts and protocols, preventing the vulnerable Java service from downloading a malicious class file via LDAP.
  • Use a WAF to block requests containing the string “${jndi:ldap” (commonly carried in the User-Agent header)
  • Microsoft have posted an article with a workaround https://msrc-blog.microsoft.com/2021/12/11/microsofts-response-to-cve-2021-44228-apache-log4j2/
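The string-matching idea behind the WAF rule above can be sketched as follows. Naive matching like this is easily bypassed by obfuscated lookups such as ${${lower:j}ndi:…}, so treat it as a first-pass filter only; the marker list here is illustrative, not exhaustive:

```python
# Sketch: naive first-pass detection of JNDI lookup strings in a request
# header value. Real WAF rules are more sophisticated; obfuscated payloads
# will slip past simple substring checks like this.

SUSPICIOUS = ("${jndi:ldap", "${jndi:ldaps", "${jndi:rmi", "${jndi:dns")

def looks_like_log4shell(header_value: str) -> bool:
    value = header_value.lower()  # payloads vary in case, e.g. ${JNDI:LDAP
    return any(marker in value for marker in SUSPICIOUS)
```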


References:

https://logging.apache.org/log4j/2.x/security.html

https://github.com/cisagov/log4j-affected-db

https://cve.mitre.org/cgi-bin/cvename.cgi?name=2021-44228

https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-45046

https://msrc-blog.microsoft.com/2021/12/11/microsofts-response-to-cve-2021-44228-apache-log4j2/

Transparity Managed Security Service customers are being proactively reviewed and remediated as information on this vulnerability is released and additional mitigations are identified. All other customers should contact the Transparity Service Desk or their Account Manager if they require assistance or advice on remedial steps in order that we can work with you to define your requirements and a suitable action plan.

13/12/2021

On the 9th of December 2021, a remote code execution (RCE) vulnerability in Apache Log4j 2 was identified as being exploited in the wild. The vulnerability, tracked as CVE-2021-44228 and referred to as “Log4Shell”, affects Java-based applications that use Log4j 2 versions 2.0 through 2.14.1. Log4j 2 is a Java-based logging library that is widely used in business system development, included in various open-source libraries, and directly embedded in major software applications.

Because the discovery of this exploit is so recent, many servers – both on-premises and within Cloud environments – have not been patched yet. We highly recommend that organisations upgrade to the latest version (2.15.0-rc2) of Apache Log4j 2 for all systems. CVE-2021-44228 is considered a critical flaw, and it has a base CVSS score of 10 – the highest possible severity rating.


Some of the software currently known to be affected:

  • Apache Struts
  • Apache Solr
  • Apache Druid
  • Apache Flink
  • ElasticSearch
  • Flume
  • Apache Dubbo  
  • Logstash
  • Apache Kafka  
  • Spring-Boot-starter-log4j2
  • Apache Maven
  • Cloud Manager
  • Apache Tomcat


Current Mitigations:

  • Disable message lookups for logging mechanism API functions.
  • Restrict access and protocols that Log4j2 permits via Lightweight Directory Access Protocol (LDAP) and the Java Naming and Directory Interface (JNDI)
  • Update to the latest version – Log4j2 2.15.0-rc2
  • To prevent attacks at the network level, outbound connections from affected servers can be limited to trusted hosts and protocols, preventing the vulnerable Java service from downloading a malicious class file via LDAP.
  • Use a WAF to block requests containing the string “${jndi:ldap” (commonly carried in the User-Agent header)
  • From 2.0-beta9 to 2.10.0, the mitigation is to remove the JndiLookup class from the classpath: zip -q -d log4j-core-*.jar org/apache/logging/log4j/core/lookup/JndiLookup.class
  • For *any* vulnerable version, the JndiLookup.class file can be deleted in the same way
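For Log4j 2.10.0 and later, message lookups can also be disabled with a JVM system property or environment variable. A hedged sketch follows (the jar name is a placeholder; note that Apache subsequently advised this setting does not fully mitigate CVE-2021-45046, so patching remains the priority):

```shell
# Disable JNDI message lookups (Log4j 2.10.0+). Incomplete protection --
# upgrade Log4j where at all possible.
java -Dlog4j2.formatMsgNoLookups=true -jar your-app.jar   # "your-app.jar" is a placeholder

# Or set an environment variable before starting the JVM:
export LOG4J_FORMAT_MSG_NO_LOOKUPS=true
```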


Microsoft have posted an article with a workaround:

https://msrc-blog.microsoft.com/2021/12/11/microsofts-response-to-cve-2021-44228-apache-log4j2/


Basic Update Steps:

The steps below are simplified and intended as generic guidance; each application may vary. Please ensure backups are taken prior to updating where testing has not been carried out.

  1. Browse to https://logging.apache.org/log4j/2.x/download.html
  2. Download the 2.15.0 version applicable to your environment
  3. Make sure to check the integrity of the file using the PGP signatures
  4. Follow your standard process to update both the API and core paths to use the newer jars
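Alongside the PGP check in step 3, Apache also publishes SHA-512 checksums for each release. As an illustrative sketch (the file name and published digest are placeholders), the comparison could be done as:

```python
# Sketch: compare a downloaded archive against its published SHA-512
# checksum (Apache publishes .sha512 files alongside releases; PGP
# verification with `gpg --verify` remains the stronger check).
import hashlib

def sha512_of(path: str) -> str:
    h = hashlib.sha512()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):  # stream large files
            h.update(chunk)
    return h.hexdigest()

def matches_published(path: str, published_hex: str) -> bool:
    return sha512_of(path) == published_hex.strip().lower()
```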


References:

https://cve.mitre.org/cgi-bin/cvename.cgi?name=2021-44228

https://www.reddit.com/r/crowdstrike/comments/rda0ls/20211210_cool_query_friday_hunting_apache_log4j/

https://socprime.com/blog/cve-2021-44228-detection-notorious-zero-day-in-log4j-java-library/

https://www.oracle.com/security-alerts/alert-cve-2021-44228.html

https://www.eshlomo.us/hunting-log4j-with-sentinel/

Please contact us here if you need help and support in mitigating this vulnerability.

Microsoft appoints Transparity as an Azure Expert MSP

Transparity have today achieved Microsoft Azure Expert Managed Service Provider (AEMSP) status. Azure Expert MSP is the highest Microsoft Accreditation level for Azure and independently validates Transparity’s expertise and quality of Cloud transformation services, covering planning, migration, optimisation, operation, and management of Cloud services on the Azure platform.

Transparity offers a dedicated portfolio of Professional and Managed Services, spanning the entire lifecycle of Cloud services running on the Azure platform. These offerings support customers to better assess, migrate, manage, protect, optimise, and innovate with a vast range of Cloud solutions including Infrastructure workloads, Azure DevOps, Azure Sentinel, Azure Virtual Desktop (AVD) and so much more.

This highly sought-after accreditation comes following Transparity gaining the Azure-based Microsoft Windows Server and SQL Server Migration Advanced Specialisation.

Having nine Advanced Specialisations, as well as Azure Expert MSP status, puts Transparity in an elite group of partners both in the UK and globally and confirms that Transparity’s experts can design, build and manage a secure, resilient and scalable Cloud infrastructure that empowers and enables any organisation and its users to perform to the best of their ability.

David Jobbins, CEO at Transparity, said:

“It’s incredibly exciting that all the hard work by Team Transparity has today paid off as we become Azure Expert MSP accredited. This award highlights our technical capabilities in the delivery of Azure solutions and services, but most importantly it recognises the effort that our team have invested in our processes, strategy, and ongoing engagement with our customers. We’re delighted to be joining this exclusive group who represent the very best Azure Managed Service Providers, and it positions us as one of the most accredited pureplay Microsoft Partners in the UK. Thank you Team Transparity!”


Orla McGrath, Global Partner Solutions Lead, Microsoft UK, said:

“Partners play a central role in Microsoft’s vision to support organisations across all sectors in their digital transformation efforts. Transparity has demonstrated its in-depth knowledge and expertise across the Microsoft portfolio and we are delighted to recognise the business as a Microsoft Azure Expert Managed Service Provider.”


Michael Wignall, Azure Business Lead at Microsoft UK, said:

“Transparity invest significantly in their Microsoft expertise and technical knowledge and I’m excited to see them reach Azure Expert MSP status.”

Book a free consultation

Transparity accredited with the Disability Confident Employer (Level 2)

Transparity is proud to announce they have hit another milestone in their inclusive employer journey, having been accredited as a Disability Confident Employer (Level 2), demonstrating Transparity’s ongoing commitment to greater inclusivity in the workplace.

On the 22nd November 2018, Transparity became the 10,000th Disability Confident member and attended an exclusive event at 10 Downing Street with Microsoft and other business leaders to collect their certificate. Three years on, Transparity are delighted to have gained Disability Confident Employer (Level 2) status, as their team are passionate about making the tech landscape more diverse and inclusive.

The Disability Confident employer scheme was created as a movement of change that encourages employers to improve the recruitment, retention and development of people with disabilities in the workplace. The scheme now has more than 20,000 members, who are benefitting from this fantastic talent pool of highly skilled and hardworking staff.

Neil Tune, Chief People and Culture Officer at Transparity, said:

“We are delighted and very proud to be furthering our inclusion journey by gaining Level 2 as a Disability Confident Employer. Our people are what make Transparity an outstanding place to work, and creating a place where everyone has the opportunity to thrive is fundamental to our culture. We continue to ensure our workplace is as inclusive and diverse as possible, by putting our people first.”

Book a free consultation

Transparity announce Microsoft FastTrack Service

Transparity are pleased to announce their Microsoft FastTrack Service offering, which has been in development since Transparity became a FastTrack Ready Partner earlier this year – one of only a handful in the UK. Transparity have a team of dedicated FastTrack specialists who can help support organisations on their digital transformation journey by leveraging Microsoft 365 technologies.

FastTrack is a Microsoft-funded initiative that enables organisations to accelerate their move to Microsoft 365. This service provides assessment, planning and advisory assistance to customers looking to adopt Microsoft Cloud solutions, and complements Transparity’s traditional consultancy services. Transparity’s FastTrack service offering includes expert guidance and access to Microsoft SMEs (subject matter experts) to provide accelerated onboarding.

To help users optimise their investment in Microsoft 365, additional tools and resources such as Transparity’s Training and Adoption Portal (TAP) are available and have been designed to showcase and provide effective learning tools for available features.

Tim Hannibal, Chief Partner Officer at Transparity said: “Transparity’s FastTrack service is an exciting opportunity – supporting customers in implementing and adopting Microsoft 365 services. With so many now working from home, it’s imperative that we do our part in providing quick and effective access to Microsoft technologies, so that organisations can thrive during these unprecedented times.”

Marco Durante, FastTrack Manager at Transparity said: “It’s an exciting time for Transparity to showcase the capabilities that our FastTrack service has to offer. We’re very proud of the work we’ve been doing to help support our customer journeys. Our aim is to provide this service with full transparency so that organisations know what to expect and what they’ll gain from partnering with us.”


Visit Transparity’s Microsoft 365 FastTrack page here to find out more.

Book a free consultation
