Category: Technologies

  • Microservices as a transformation in software development 

    In today’s world of technology, software development is becoming increasingly complex and dynamic. One of the most modern approaches to building applications is the microservices architecture. In this article, we will delve into the world of microservices and explain why they are essential, as well as the benefits and challenges they bring to developers and businesses. 

    Microservices are an approach to designing and building applications that involves breaking down the system into smaller, independent services. Each microservice represents a separate function or module of the application and operates as an independent unit. The concept of microservices architecture began to emerge in the early 21st century. Although the idea of dividing systems into smaller, independent services existed earlier, the term "microservices" gained popularity between 2010 and 2012. The initial works on this approach appeared in industry literature and among IT experts, and the concept itself began to be widely used in the context of modern system architectures. Since then, microservices architecture has gained significance and become one of the main trends in software development. 
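
    The idea of a small, independent service can be sketched in a few lines. Below is a minimal, hypothetical "orders" microservice exposing a health endpoint over HTTP, using only the Python standard library; the service name, route, and payload are illustrative, not a prescribed design.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class OrdersHandler(BaseHTTPRequestHandler):
    """A hypothetical 'orders' microservice: one small, independent unit
    exposing its function over HTTP."""

    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"service": "orders", "status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        # Keep the demo output quiet.
        pass

def start_service(port=0):
    """Start the service in a background thread; return the bound port.
    Port 0 asks the OS for any free port."""
    server = HTTPServer(("127.0.0.1", port), OrdersHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server.server_port

if __name__ == "__main__":
    port = start_service()
    with urllib.request.urlopen(f"http://127.0.0.1:{port}/health") as resp:
        print(resp.read().decode())
```

    In a real deployment each such service would have its own codebase, database, and lifecycle; the point of the sketch is only that the unit is small and self-contained.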

    Microservices architecture brings several benefits that make it attractive to many organizations. Here are some of the main advantages of microservices: 

    Flexibility and scalability 

    Microservices allow flexible scaling of individual services according to the workload. Unlike monolithic systems, where scaling the entire application is necessary, microservices enable adjusting resources only where needed. 

    Easier manageability 

    By separating the system into smaller units, microservices are easier to manage. Each microservice has its own codebase, database, and infrastructure, facilitating monitoring, debugging, and system maintenance. 

    Faster deployment and iteration 

    Microservices enable quick deployment of new features because changes in one microservice do not affect the rest of the system. This allows for faster product iteration and the delivery of new features to customers. 

    Technological diversity 

    Each microservice can be written in a different technology, providing developers the flexibility to use the best-suited tools for a specific function. This increases flexibility in choosing technologies and frameworks. 

    More effective fault tolerance 

    In the event of a failure in one microservice, the rest of the system can continue to operate normally. This increases resilience to failures and minimizes the impact of issues on the entire application. 
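
    This graceful degradation can be sketched as a simple fallback wrapper: if one service call fails, the caller substitutes a default instead of failing the whole request. The "recommendations" service and the fallback value below are hypothetical.

```python
def call_with_fallback(service_call, fallback):
    """Invoke a service; on any failure, return the fallback value
    so the rest of the system keeps operating."""
    try:
        return service_call()
    except Exception:
        return fallback

def recommendations_service():
    # Simulated failure of one microservice.
    raise ConnectionError("recommendations service unavailable")

# The calling service degrades to a sensible default instead of crashing.
products = call_with_fallback(recommendations_service, fallback=["bestsellers"])
print(products)  # ['bestsellers']
```

    Production systems typically layer retries, timeouts, and circuit breakers on top of this idea, but the principle is the same: one failing service must not take down the rest.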

    Easier team development 

    Each microservice can be developed and maintained by a separate development team, making team management easier and increasing flexibility in organizing work. 

    Cloud solution compatibility 

    Microservices work well with cloud computing, allowing easy deployment and scaling in the cloud. This enables organizations to efficiently utilize cloud resources. 

    Additionally, microservices provide a solid foundation for implementing modern technologies such as containerization and container orchestration, enhancing infrastructure flexibility and efficiency. 

    However, it is important to note that while microservices offer many benefits, they also introduce certain challenges, such as managing communication between services or maintaining data integrity. These challenges should be considered when designing and implementing a system based on this architecture. 

    The decision to adopt microservices in a project should be made after a thorough analysis of business specifics, project scale, and the teams' readiness for changes in organizational culture. It is essential to consider deployment flexibility, run a cost-benefit analysis, and understand the complexity of inter-service communication. The impact on project performance and alignment with security and compliance standards must also be evaluated. Drawing on industry experience and analyzing the successes and challenges of other organizations that have implemented microservices can help inform the decision. 

  • What benefits do automated tests bring?

    Modern technologies, regardless of the industry, often pose a challenge for creators to ensure quality in complex products. Automated tests offer a solution to these challenges by enabling quick and effective software checks. Companies, irrespective of their field, are increasingly leveraging test automation to enhance process efficiency and ensure better product quality.

    Tests

    The software development cycle of a single iteration comprises numerous steps, starting from conception and concluding with version retirement. Between the first and last stages, after the implementation of the concept but before the product is installed and released, comes the testing phase. This phase can be likened to material durability testing and certification procedures. First, the software is tested under optimal conditions; only then is it subjected to the worst possible configurations. It is first used as intended, then fed incorrect data or tasked with performing non-obvious operations. All this is done to ensure that the software functions predictably across various hardware configurations and does not lose stability should a user opt for less common options or make an error.

    However, manually installing software, executing it, and running diverse sequences of actions is both time-consuming and insufficient on its own. Hence, automated testing is gaining in popularity. What is it, and what benefits arise from its implementation?

    Automation

    As the name suggests, it involves procedures that execute automatically. An automated test is a script executed within a designated program that navigates through various paths, performs operations on the tested software, and retains the results. Such an approach comes with several benefits for both product creators and entities commissioning its development.

    1. Repeatability. Automated tests are conducted by machines following a pre-programmed scenario. This eliminates a significant issue in manual approaches – the human factor. Each test looks the same every time, eliminating the possibility of skipping a step due to fatigue or carelessness.
    2. Speed. Conducting a single automated test is significantly faster than executing the same algorithm manually. This applies not only to entering data and initiating various options but also to activities following the completion of the procedure. This results in quicker readiness for retesting and, in the long run, the ability to conduct a greater number of trials within a given timeframe.
    3. Precision. Automated tests are immune to the cognitive errors that humans are susceptible to. As a result, they can detect potential errors in the code and in software behaviour long before a person might notice that something is amiss. For instance, a manual tester might click an option on a web page, and if an icon responds not immediately but with a slight delay, from a user's perspective it might not even be a noticeable difference. In reality, however, delayed execution of a procedure can indicate a deeper issue in the code. An automated test measures the time between an action and its response, recording how many milliseconds it took to execute a function and thereby providing developers with crucial information. This data allows them to scrutinize the problem and optimize the code before hundreds of small errors turn the page into another Facebook.
    4. Multithreading. Among the benefits of automation, one cannot overlook parallel testing. Contrary to common belief, humans are unable to focus on multiple tasks simultaneously; they can only switch rapidly between different tasks. Nevertheless, this shifting of attention between several activities can lead to errors and oversights, ultimately slowing down the pace of work. Automated tests, leveraging multi-threaded hardware and software architectures, are not limited in the same way as humans. It’s possible to concurrently execute multiple algorithms, which proves beneficial, for instance, when assessing a new product version for backward compatibility.
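
    A minimal sketch of what such a pre-programmed scenario can look like in practice, using Python's unittest. The function under test, `apply_discount`, and its expected values are invented for illustration; note how one test also measures timing, in line with the precision point above.

```python
import time
import unittest

def apply_discount(price, percent):
    """Function under test: a hypothetical pricing routine."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class DiscountTests(unittest.TestCase):
    def test_happy_path(self):
        # Expected, "as intended" usage.
        self.assertEqual(apply_discount(200.0, 25), 150.0)

    def test_invalid_input(self):
        # Deliberately feeding incorrect data, as described above.
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)

    def test_response_time(self):
        # The script measures timing too, so slow code paths are flagged
        # long before a human would notice the delay.
        start = time.perf_counter()
        apply_discount(100.0, 10)
        self.assertLess(time.perf_counter() - start, 0.1)

if __name__ == "__main__":
    unittest.main(exit=False, verbosity=2)
```

    Once written, this scenario runs identically every time, which is exactly the repeatability benefit from point 1.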

    Will automated testing programs replace humans?

    There are no perfect solutions, and replacing humans with machines does not necessarily mean greater efficiency without negative consequences. Automated tests are indeed very fast and accurate, but their greatest advantage is also their greatest drawback: they lack the human factor. At present, the majority of software is still designed to be user-friendly. Therefore, UX (User Experience) remains crucial: button placement, the size of the clickable area around icons, interface clarity, and many other aspects. In these matters, humans remain irreplaceable.

  • Salesforce – CRM system implementation

    Every rapidly growing company needs a unified customer relationship management (CRM) system. More often than not, its implementation can be problematic and takes a long time, as the software has to be adapted to specific needs, and an adequate computer infrastructure is required so that it can operate. Salesforce is one of the solutions that allows you to break out of this trend and offers easy-to-implement CRM systems.

    What is Salesforce Sales Cloud?

    It is an innovative solution based on the Software as a Service (SaaS) model. With this approach, the software is more flexible and its maintenance is no longer a problem. This means that Salesforce is not only easy to implement, but also performs brilliantly over the long term. Admittedly, this sounds like a typical marketing pitch with no grounding in reality, so let's move on and focus on the specific benefits of Salesforce and its advantages over the competition. 

    The advantages of using Salesforce:

    • Price and billing model. The cost of using the service depends on the complexity of the system and the number of people who use it. The basic package costs as little as $25 per month per user and provides essential tools such as mailbox integration with Outlook and Gmail, customer contact and opportunity management. There are also more comprehensive versions of Salesforce CRM available, offering added process automation or analytics mechanisms.
    • Scalability. The Salesforce CRM system allows you to purchase a basic plan and extend it when it is no longer sufficient. Unlike conventional desktop solutions, this does not involve expanding the infrastructure, adding extra hardware to increase computing power or a lengthy process of deploying more and more applications. All you need to do is purchase a more advanced service package.
    • Maintenance-free. Salesforce software is operated in the cloud under the Software as a Service model. This means that the customer gets ready-to-use, operational tools. Whenever anything does not function as it should, you simply report it and the system is repaired with no dedicated staff required on your part. In addition, cloud-based software reduces the risk of a major failure due to hardware problems – all data is backed up and other computing units are plugged in to replace any faulty ones.

    Broad range of products

    Salesforce provides end-to-end solutions for businesses. These are made up of a number of applications and modules integrated with each other, allowing complete support for the business. In addition to the CRM discussed above, the following must be mentioned:

    • Service Cloud, a tool to help solve customer problems. You can use it to easily check the history of requests and consequently draw conclusions about the consumer's needs. As part of this service, an up-to-date knowledge base and ready-to-use scenarios are made available to support staff, thereby increasing efficiency. Salesforce Service Cloud supports contact through multiple channels simultaneously, so that the customer is not constrained to a single platform;
    • Revenue Cloud, namely a sales application consisting of two modules – CPQ and Billing. The former is mainly targeted at traders, who can use it to quickly create and send complete offers. Billing, on the other hand, provides support for staff responsible for billing and helps prevent the confusion that often occurs during busier periods. When a sales offer is accepted, a notification is sent, so you know immediately when to invoice and what the amounts should be. The Salesforce Revenue Cloud offers an added benefit – a set of tools that facilitate running an online shop, which will undoubtedly help retail-oriented companies;
    • Marketing Cloud – an advanced tool designed to personalise promotional activities. It enables creation of campaigns targeted at specific consumers and enables direct contact with them. Personalised activities increase sales efficiency, thus maximising profit with minimal input. This tool is supplemented by…
    • Marketing Cloud Account Management, which is an application used to automate marketing activities.  It allows you to track customer behaviour, which makes it easier to find consumers with the greatest potential. It is also useful for creating personalised campaigns. MCAM also enables automation of email communication.

    Why choose Salesforce?

    Introducing a CRM system for the first time, as well as migrating from one solution to another, requires you to act quickly. You cannot afford to introduce tools slowly or to allow two incompatible systems to coexist for long: this leads to chaos in databases and results in problems with management and customer contact. Salesforce offers out-of-the-box applications that are very stable and easy to maintain, in addition to being easy to deploy. Regardless of whether the decision to switch to the system in question has already been made, we invite you to contact us. We are experienced in implementing Salesforce products, and our specialists will be happy to help you choose the right service package or train your employees in using the new tools.

  • What do you know about Big Data?

    The term 'Big Data' has been gaining popularity over the past few years. Nonetheless, how does this approach differ from ordinary data sets? What are the benefits of implementing it? What are the risks?

    What is Big Data?

    A company needs to adapt to the market in order to function smoothly and grow. You cannot go about this blindly, as that is a direct path to failure. For this reason, entrepreneurs have always collected and analysed data. Based on complete calculations of production costs, they could later optimise expenses and set margins independently for various goods, taking demand into account. In the age of the internet, with socio-economic changes occurring faster than ever before, it has become obvious that selective data collection and manual analysis, even using computer software, is not enough.

    Big Data refers to sets of huge amounts of unstructured data, too large and complex to be analysed with conventional tools. Modern technologies such as artificial intelligence are used for processing. In a nutshell, unlike previous solutions focused on recording selected matters, such as demand for specific products over successive months, with Big Data you have a stream that also includes seemingly irrelevant information. This allows you to find patterns and relationships between various, seemingly unrelated factors. Smartphones, various sensors and wearable devices, to name but a few, are used to collect these enormous amounts of data.

    What are the benefits of Big Data?

    • Rapid acquisition of knowledge. An analysis of large amounts of data from various sources allows you to find specific patterns that you can then use to maximise profits. Conventional research into customer behaviour and motivation requires more time than one conducted using Big Data. Until now, data had to be extracted manually first (e.g. through surveys). Today, a smartphone connected to the internet does this automatically. It collects and sends back data about the age group to which the buyers of services or goods belong. Then, you don’t need to manually scan through sheets for analysis, which also consumes time – with the help of AI algorithms, it is done in real time.
    • A broader view of the examined issues. The amount and type of data collected through the various sensors is almost unlimited. This allows you to look for correlations between seemingly unrelated matters when you attempt to analyse them. Unlike conventional methods, here you are not limited by the mental horizon of the individuals responsible for compiling the data.
    • Quick response to competitor activity. By using Big Data, you can monitor in real time not only your own company's standing, but also the activities carried out by other companies. This is perfectly illustrated by the example of online shops: when the price of an item drops significantly in one of them, most of the other major players on the market implement a similar change within moments. The whole process can be repeated several times in one day, which would not be possible without constant observation of the competition and real-time analysis of their activities.
    • Minimising losses in the company. Everyone makes mistakes occasionally, and mistakes can lead to financial losses. Besides, there is always the risk that an external factor will change overnight and start acting to your disadvantage. With Big Data, the response time to these situations is significantly reduced. Once specific measures have been implemented, you don't need to wait for monthly reports to know whether the decision you made was the right one. The analysis, carried out in real time around the clock, allows you to detect potential dangers or financial losses before they even become noticeable. This lets you quickly correct a mistake or adapt to a changing situation while reducing the negative consequences.

    What are the risks of using Big Data?

    • The need to incur investment costs. Introducing Big Data into a company involves implementing the right infrastructure to acquire data from various sources, then filtering out the redundant material and analysing whatever remains. This requires very powerful computers, for example through cloud computing, and appropriate software, which is also not cheap. At the same time, simply implementing Big Data does not mean that everything will start happening 'on its own' and the company will generate huge profits overnight.
    • Storage difficulties. Big Data consists of unstructured data, so it cannot be comfortably accommodated in traditional databases. The multitude of sources, on the other hand, makes it easy to exceed the limits of analytical capacity. In other words, there is more data than you can process, which in turn requires discarding some of the collected material before it can even be processed. This problem is widespread, even Google is facing it.
    • Security. The more data is collected and transferred, the greater the risk of unnoticed leakage. For this reason, it is extremely important that a company using Big Data has a thoughtful and effective cyber security policy. This involves measures aimed at protecting not only the company, but also its customers, for example through data anonymisation – that is, measures taken to ensure that no specific person can be linked to the data.
    • Excessive trust. Big Data is based on advanced algorithms and artificial intelligence. It can detect correlations that are so complex that they would be very hard to observe with smaller data sets or simpler analysis tools. Still, correlation does not imply cause and effect. It is easy to perceive that two values change in a similar way over a short period of time, but this does not at all prove that one thing results from the other. It is important to bear this in mind as you analyse the results generated by Big Data machines.

    Is it a good idea to invest in Big Data processing?

    Yes. Undoubtedly, modern data analytics technologies make it now possible to significantly increase a company’s productivity and profitability. That said, this involves costly investments that do not offer a 100% guarantee of maximising profits in the short term. You should bear this in mind and make the decision to invest in Big Data consciously, knowing not only the potential returns but also the risks.

  • Why is Kanban useful in IT?

    Some time ago we described the DevOps methodology on the StackMine blog, its main premise being close collaboration between teams of developers. In order to effectively implement a programming process of this kind, it is essential to use the right tools and follow the defined procedures. One great solution to this issue is Kanban – a predefined framework that facilitates process transparency and an ongoing flow of information. However, to fully understand what it is all about, let’s start at the very beginning – that is, the idea of agile programming.

    What is Agile about?

    This methodology is based on working in short, intensive periods (sprints), during which a specific part of the final product is delivered. This way, it is possible to implement the software more quickly and respond to evolving customer needs. Agile enables a shorter transition time between iterations compared to the waterfall method. At the same time, this approach provides more room for change, reducing the risk that the software will already be obsolete on the day the final version is delivered.

    What is Kanban?

    The name of this solution comes from a Japanese term originally meaning a signboard or information board. Its aim is to ensure that everyone involved in a project can obtain real-time information on its progress. This way, the process can be optimised: tasks are allocated to the team dynamically, so that no one is overloaded but no one sits idle either. This is particularly important in sectors such as IT, where no physical goods are produced and it is often hard to gauge the current status of a project without proper visualisation.

    How does this work?

    One key concept in the Kanban methodology is the board that presents information on the progress of work. Boards consist of columns indicating the successive stages of specific activities. There is no predefined board pattern; each team can create its own model. Note, however, that boards must not be excessively complex, as this will compromise their clarity and, overall, make the work harder rather than faster. The basic columns include 'to-do', 'in progress' and 'done', but nothing prevents you from adding your own elements, such as 'needs testing'.

    On the boards, the team places individual items to symbolise units of work. This helps simplify the process: one item in a column is one activity. It is good practice to introduce an item limit here. For example, if the 'to-do' column shows 5 items to be completed in the near future, do not add more until a slot becomes free. This way, you avoid the chaos caused by dealing with too many tasks at once.
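
    The board-and-limit mechanics can be sketched in a few lines. This is a minimal, illustrative model only, assuming the three basic columns and the limit of 5 items mentioned above; real teams would use a tool rather than code.

```python
class KanbanBoard:
    """A toy Kanban board enforcing a work-in-progress (WIP) limit
    per column."""

    def __init__(self, columns=("to-do", "in progress", "done"), wip_limit=5):
        self.wip_limit = wip_limit
        self.columns = {name: [] for name in columns}

    def add(self, column, task):
        """Add a task, refusing to exceed the column's WIP limit."""
        if len(self.columns[column]) >= self.wip_limit:
            raise ValueError(f"WIP limit reached in '{column}'")
        self.columns[column].append(task)

    def move(self, task, src, dst):
        """Move a task between columns, respecting the limit in dst first,
        so a refused move never loses the task."""
        if len(self.columns[dst]) >= self.wip_limit:
            raise ValueError(f"WIP limit reached in '{dst}'")
        self.columns[src].remove(task)
        self.columns[dst].append(task)

board = KanbanBoard()
board.add("to-do", "design login page")
board.move("design login page", "to-do", "in progress")
print(board.columns["in progress"])  # ['design login page']
```

    Refusing the sixth item instead of quietly accepting it is the whole point of the limit: the bottleneck becomes visible immediately.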

    The workflow in the Kanban system goes from the point of commitment to the point of delivery. The former is a space where the team selects the ideas and suggestions it intends to implement from all the ideas collected. The point of delivery, on the other hand, is the final stage, which involves the consolidation of the various elements of the programme and deployment, i.e. the implementation of the current version and handing it over to the customer.

    Kanban tools

    The simplest way to introduce the Kanban methodology is a simple, physical board and a set of sticky notes. There are, however, a number of tools available for more convenient and versatile management. The most popular Kanban programmes include Trello, Jira and Kanbanize. They allow you to access the board online, conveniently create columns as required, and set timeframes for each task.

    Why introduce Kanban?

    With advanced projects, it is easy to get stuck due to poorly planned activities, starting to work on too many things at the same time – or assigning tasks in a less than optimal way, resulting in the team being overburdened. Kanban lets you notice at which point in the process a bottleneck occurs, making it easier to allocate resources accordingly, resulting in greater efficiency and faster turnaround on the next stage of the project.

  • Business Intelligence – what is it?

    Business Intelligence (BI) is one of the most important use cases for information technology in a company. What does this name stand for? What are the benefits of using Business Intelligence?

    Data, information, knowledge

    In his book 'Organisation Theory and Design', Richard L. Daft defined data as numbers, words, phone calls or computer printouts, sent or received. Without the proper context, they mean nothing. Once they are used to improve someone's understanding, they become information. For instance, a person's name and telephone number are merely data until you need to call that person. Knowledge, on the other hand, is defined as 'information in action', that is, the result of the deduction made.

    Data can include, for example, financial reports covering the most recent several years. Information will then include a comparison of revenues from the sale of goods and services over specific periods, while knowledge will be the perceived correlation between the increase or decrease in the sold volume and the price and time of the year.

    The final stage is wisdom, which means the ability to use the knowledge acquired through the analysis of information. Examples include placing Christmas decorations on supermarket shelves in mid-November and increasing toy prices two weeks before Christmas Eve.

    There is no wisdom without data

    It is, however, difficult to predict which of the stored values will prove useful in the future. This is all the more important because often seemingly unrelated or even irrelevant data have a significant impact on the broader picture. This means that, in order to correctly analyse certain events, Business Intelligence uses Big Data. These are large sets of diverse and variable data aimed at generating knowledge, far too extensive and comprehensive to be analysed by humans.

    Statistics meets IT

    Data collected from the multiple databases functioning within a company are aggregated in so-called warehouses: structures optimised for a certain slice of reality. They are intended only for reading and analysis. Their contents cannot be changed; however, they are cyclically fed with further data sets and retain whatever has been transferred to them previously. This way, Business Intelligence procedures can be used to perform long-term analyses in order to gain information relevant to development. Tools that support data warehousing include, for example, Amazon Web Services.

    Finding statistically significant information and drawing conclusions can be difficult, especially when you are working with data spanning several years or looking for things that are not obvious. Contemporary IT allows you to build Business Intelligence systems that significantly facilitate this process. It is possible to create algorithms that produce comprehensible reports based on the data periodically sent to the warehouse. BI specialists can then use this information to examine specific indicators, for example through data mining. However, a good knowledge of statistics is extremely important here. A properly written algorithm will be objective, but it is the person who ultimately converts information into knowledge. It is therefore their responsibility to draw the correct conclusions.
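
    A periodic warehouse report of this kind can be sketched as follows. The records, field names, and figures below are invented for illustration, echoing the toys-before-Christmas example from earlier in the article.

```python
from collections import defaultdict

# Raw records arrive in the (append-only) warehouse.
warehouse = [
    {"quarter": "Q1", "product": "toys",  "revenue": 1200},
    {"quarter": "Q1", "product": "books", "revenue":  800},
    {"quarter": "Q4", "product": "toys",  "revenue": 4100},
    {"quarter": "Q4", "product": "books", "revenue":  900},
]

def revenue_report(records):
    """Aggregate revenue per (quarter, product): data becomes information."""
    totals = defaultdict(int)
    for row in records:
        totals[(row["quarter"], row["product"])] += row["revenue"]
    return dict(totals)

report = revenue_report(warehouse)
print(report)
# Noticing that toy revenue jumps in Q4 (pre-Christmas) is the analyst's
# step from information to knowledge.
```

    The algorithm is objective; interpreting the Q4 jump and acting on it is where the statistician's judgment comes in.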

    What can BI give your organisation?

    Every company seeks to maximise its efficiency and profits. With small companies geared towards a narrow target group, this effect can be achieved quite simply, but the larger the player involved, the more variables affect its profitability. There is no room for operating without prior knowledge, a cross-sectional analysis of historical data is essential. However, it must be not only accurate, but also fast. This is only attainable with correctly implemented Business Intelligence technologies.

  • DevOps – what is it?

    DevOps has become an increasingly popular concept in the IT industry in recent years. However, the prevalence of the name itself is not matched by the awareness of what it actually means. I will try to shed some light on that in this article.

    What is DevOps?

    The term DevOps is a portmanteau, a word cluster consisting of the words Development and Operations. It means a methodology for running IT projects that emphasises close collaboration between the teams responsible for software development (Dev) and those in charge of operations (Ops). However, without a broader discussion on the tasks of the aforementioned teams, this entire paragraph sounds like a pleonasm. The floor is made of floor and the butter tastes like butter.

    What are the responsibilities of the Operations department?

    The competences of that department are very broad. In a nutshell, it is responsible for the way the IT department is managed, in terms of both hardware and software. In other words, it decides in which direction the software needs to be developed. However, in order to set the correct trajectory, it is essential to collect feedback from users – this is one of the tasks of the Operations department as well.

    Other competences of this department include providing support, that is, running a helpdesk. This means that an Operations department has to stay in constant contact with both product users and developers. This means that we are dealing with two organelles within a single cell – the flow of information between them must be smooth and uninterrupted, and the functioning of one is dependent on the other.

    What if we removed the division?

    Let’s go back to the definition of the term DevOps, outlined in the introduction. Collaboration between two groups of people in the IT department, as close as if they had merged into one organism. What are the consequences of eliminating divisions in this way?

    The DevOps methodology significantly accelerates product development. This is made possible by the constant exchange of information between departments and by using an iterative working model. In contrast to the once common waterfall model, in which each stage of software development was treated as a separate task initiated once the previous one had been completed, here the processes of planning, analysis, design and implementation are intertwined. This makes the product production process more flexible.

    By using the DevOps methodology, a company can save time and money. Above all, it is capable of delivering a functioning product quickly. Obviously, this does not mean that the first iteration will bring about the final version of the software. However, by creating an MVP (minimum viable product), development can continue once the software is up and running, and already in use. In this way, the client is in a position to reconsider its assumptions and, potentially, redefine its needs. With the close collaboration of people, who in a standard working model would be split into two teams (Development and Operations), you don’t waste time creating a full version of the software, getting feedback, re-analysing and working from scratch again. Instead, feedback is collected at the very first iteration of the product and can be communicated to developers who are focused on quickly releasing further versions of the results of their work.

    Software life cycle

    Software must keep evolving at all times. This applies to implementing new functionalities as well as fixing bugs or patching security gaps. These activities take place long after the first final version of the software has been delivered to the client. Here, too, the DevOps methodology significantly improves the work.

    The Operations department, in collaboration with the client, determines the course of development for the product. It helps set goals and collect feedback – what is working well, what can be improved, which features might be added or removed. In close collaboration with the Development department, such changes can be implemented seamlessly. Instead of waiting until enough suggestions accumulate to make rewriting the programme from scratch worthwhile, many successive iterations can be created, each solving one of the problems. This gives the client the feeling that its suggestions are taken seriously, as they are met with a swift response.

    Reducing the number of bugs

    Automation is an essential element of the DevOps methodology. It might seem that its purpose is to speed up work, but that is not its primary goal. People, unlike machines, are not very good at repetitive tasks. As their experience grows, they perform them increasingly quickly, but this means slipping into a routine that may lead to mistakes. When developing applications that combine multiple technologies and resources, this is very risky.

    Automating the deployment processes of new software versions, especially when they appear very quickly one after the other, is absolutely essential. With dedicated tools, it is possible to mitigate risk – the team deals with developing the application, while an algorithm makes sure that the specific iteration is fit for publication. This allows you to ensure regularity in publishing. One example is Messenger, with new versions released every three days.

    Is it hard to implement the DevOps methodology?

    No. It is not necessary to make revolutionary changes to the company. The process of merging the Development and Operations departments and implementing a variety of new solutions does not have to happen overnight. It is a minimally invasive measure which does not force a comprehensive change in working habits. On top of that, in most companies the first benefits of switching to DevOps can be seen as early as 2-3 months after the start of the transformation.

  • Benefits of Cloud Computing

    Benefits of Cloud Computing

    The previous article outlines the basic concepts associated with Cloud Computing. It explains the technicalities of Cloud Computing and shows the differences between the various service providing models. This text explores the real benefits of outsourcing processes.

    Scalability

    The amount of computing power required to perform a company’s tasks efficiently is not constant over time. This can be observed even on personal computers: even if you buy a device with a considerable power reserve, sooner or later you will find it insufficient. In other words, you purchase an expensive piece of equipment whose capabilities you use only to a small extent – until one day you realise you could do with a more powerful machine.

    This is not an issue with cloud computing, since many providers give you the option to adjust the volume of resources you use. Consequently, during periods of lower traffic, it is cheaper to maintain the IT infrastructure, as less computing power is used – but, if necessary, more memory or more processor cores can be accessed at little cost. Stackmine’s migration of an automotive wholesaler’s sales platform to the AWS cloud is a great example of this. By using auto-scalable infrastructure with ELB (Elastic Load Balancing) technology, the Client no longer needs to worry about computing power – its amount is adjusted dynamically.
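    To give a feel for how such dynamic adjustment works, here is a minimal sketch of a scaling policy in Python. The thresholds and limits are invented for illustration and are not the actual configuration used in the project:

```python
def desired_instances(current: int, cpu_load: float,
                      scale_up_at: float = 0.75, scale_down_at: float = 0.25,
                      minimum: int = 1, maximum: int = 10) -> int:
    """Decide how many machines the fleet should have, given current load."""
    if cpu_load > scale_up_at:
        current += 1      # traffic spike: add a machine
    elif cpu_load < scale_down_at:
        current -= 1      # quiet period: pay for less
    return max(minimum, min(maximum, current))
```

    For example, a fleet of 2 machines at 90% CPU load grows to 3, while the same fleet at 10% load shrinks to 1 – the client pays for capacity only when it is actually needed.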

    Zero maintenance

    Equipment wears out over time and, according to Murphy’s Law, a major failure will occur at the worst possible moment. When data is processed on company-owned computers and servers, this can pose a serious problem. It is therefore crucial to ensure that any problems are fixed as quickly as possible, which involves employing specialists who are on alert around the clock. Some larger companies opt for backup – spare machines that can take over tasks from the ones that are temporarily out of service. But what if the damage is caused by an external factor, such as power supply problems? At the end of May 2022, the transformer station at the power plant in Bełchatów failed, shutting down 10 of the 11 energy blocks. To guard against the risk of a blackout, you can install your own electricity generators, but this is a costly investment – and computers consume huge amounts of energy.

    By using cloud computing, you can gracefully sidestep these risks. The calculations are made on large supercomputer farms. This means that the data is backed up in many secure ways, and the computing power you use is provided not by independent units but by several interconnected modules of a giant machine. This is, of course, a major simplification, but it captures the essence of maintenance-free cloud computing. When, for some reason, the power goes out, back-up power supplies are activated, and you don’t pay millions for their installation and maintenance – the cost of operating them is only a fraction of the subscription price. If the mass storage fails, or any other hardware failure occurs, this is not a problem – other machines will take over the tasks, and since they work together all the time, you won’t even notice the moment of the switchover. On the software side, cloud technologies enable the automation of many processes, thus streamlining work. For the aforementioned migration of the sales platform to the AWS cloud, the Stackmine team used Aurora technology, so that the size of the database is adjusted dynamically as required. The deployment of the application was automated, which means less work and stress on the part of the Client.
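    The takeover described above can be illustrated with a tiny Python sketch. It is a deliberate simplification – in the cloud, failover happens at the infrastructure level, invisibly to the application – but it shows the principle of redundant machines covering for one another:

```python
def serve(request, nodes):
    """Pass the request to the first node that responds."""
    for node in nodes:
        try:
            return node(request)   # happy path: this machine answers
        except ConnectionError:
            continue               # machine is down - the next one takes over
    raise RuntimeError("no healthy nodes left")
```

    As long as at least one replica is healthy, the caller gets an answer and never learns which machine produced it.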

    Price

    This is a somewhat controversial aspect. Indeed, if you compare hardware prices from official distributors, renting cloud power over several years appears to be more expensive overall than running your own server. But such a comparison overlooks additional costs that do not come to mind at first.

    First of all, as I mentioned in the first paragraph, hardware needs to be replaced over time with newer and more efficient equipment, and reselling used equipment that is several years old will not cover the cost of buying new machinery. Bear in mind, too, that it is good practice to keep a backup of data and computing units in case of failure or a sudden, unforeseen increase in demand.

    In-house data processing means that you need to provide space for the machinery park. This could mean renting or building your own hall, which significantly increases the cost right from the start. With cloud computing, all that a company needs is an office and a fast internet connection.

    Having your own data processing centre means that you have to pay for your electricity supply – and, as I mentioned above, computers have quite an appetite for it. In addition, electricity prices in Poland are relatively high and this trend will not be reversed in the years to come, as it is directly linked to the high carbon intensity and low efficiency of our energy sector.

    Ultimately, when you pay the subscription for cloud computing services such as AWS, you get technical support from experts who make sure the hardware works as it should. With in-house data processing, you need to employ such people yourself. As you can see, in the big picture, cloud computing turns out to be a cheaper (and certainly more financially convenient) alternative with stabilised prices.

    Migration

    The computers used for cloud computing are not the same as our desktop computers. On this hardware you cannot simply install the software you are already using and work from a remote desktop. Many systems need to be rewritten in a platform-appropriate way. This may raise concerns about taking the step, but fear not – you are on the Stackmine website! We have the experience, the knowledge and a number of successful migrations to the cloud to our credit. Simply write to us and we will do the rest.

  • What is Cloud Computing?

    What is Cloud Computing?

    Nowadays it is very difficult to run a business without being backed by technology. Powerful computing machines help us speed up a lot of activities by relieving people of the burden of making complex calculations. Intel’s late-2021 flagship processor, the i9-12900K, can perform more than one billion floating point operations per second while maintaining a footprint of 45 mm by 37.5 mm. The 1.65 cm^2 microSDXC standard memory card can hold up to 512 gigabytes of data, which translates to almost 65,000 pages of Microsoft Word documents. In comparison, when printed double-sided on A4 paper weighing 80 g/m^2, this stack would be 325 cm tall.
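    The paper-stack arithmetic is easy to verify yourself. A few lines of Python, assuming a typical sheet thickness of about 0.1 mm for 80 g/m^2 paper:

```python
PAGES = 65_000        # Word pages that fit on the 512 GB card
SHEETS = PAGES // 2   # double-sided printing halves the number of sheets
SHEET_MM = 0.1        # assumed thickness of one 80 g/m^2 A4 sheet

height_cm = SHEETS * SHEET_MM / 10   # mm -> cm
print(round(height_cm), "cm")        # 325 cm
```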

    Technology costs money.

    This is being felt particularly clearly now, at the time of a global economic crisis and a shortage of semiconductors. For this reason, it is an increasingly popular strategy to shift away from acquiring in-house computing units to renting resources that are available remotely. Performing calculations, providing IT services and storage of data using machines that are remote from the customer is referred to as cloud computing.

    How does this work?

    Cloud computing is a concept that includes all operations that take place beyond the local network and its firewall. In other words, performing calculations using a private server owned by a company and located on its premises does not represent cloud computing, although the activities are performed outside the client computer. Cloud computing can be based on one of several models.

    • IaaS – Infrastructure as a Service. In other words, the client gets a unit with a specific computing power and specific security measures. The quantity of resources made available is flexible to some extent – for instance, within AWS, Amazon offers a choice between several pricing options, under which it leases the relevant amount of memory and processor cores. Within this model, the cost of purchasing any necessary software licences and installing them is incurred by the client. IaaS is the most flexible of the Cloud Computing solutions, yet it requires the most IT expertise.
    • PaaS – Platform as a Service. In this model, the client is given access to a pre-configured working environment inside which they can easily create their own customised applications. The service provider is responsible for maintaining the operating system, keeping it secure and fixing any bugs. This model is more convenient for the client than IaaS, but offers fewer options – for example, as a consequence of the imposed operating systems and development environments.
    • SaaS – Software as a Service. Often referred to as “cloud apps”. In this model, the service provider supplies all the infrastructure and software. The client only has access to an interface that enables them to operate the application, often via a dedicated website. SaaS allows very limited intervention in the software. In return, it offers a highly simplified setup, often reduced to setting up a user account and defining a payment method. It works very well for universal tasks such as keeping spreadsheets or running a cloud drive.
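    One way to remember the difference between the three models is to ask who manages each layer of the stack. The Python sketch below is a simplified three-layer summary of the descriptions above, not an official vendor responsibility matrix:

```python
LAYERS = ["hardware", "operating system", "application"]

# Who looks after each layer under each service model (simplified)
RESPONSIBILITY = {
    "IaaS": {"hardware": "provider", "operating system": "client",   "application": "client"},
    "PaaS": {"hardware": "provider", "operating system": "provider", "application": "client"},
    "SaaS": {"hardware": "provider", "operating system": "provider", "application": "provider"},
}

def managed_by_client(model: str) -> list:
    """List the layers the client still has to look after."""
    return [layer for layer in LAYERS if RESPONSIBILITY[model][layer] == "client"]
```

    Moving from IaaS to SaaS, the client’s to-do list shrinks from the operating system and the application down to nothing – which is exactly the trade-off between flexibility and convenience described above.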

    Is it a good idea to transfer your processes to the cloud?

    I suspect that the foregoing simplified description of how cloud computing works may leave the reader with one question – is it advisable to invest in the cloud? After all, the move frequently means that much of the software a company uses must be developed from scratch. The Stackmine team has already helped many companies make this transition, always with a successful outcome. So if you are determined to make the move to the cloud but don’t know who to ask for help with the migration, you have probably just found a suitable partner.

    We invite you to check out our offer: Cloud Services at Stackmine.