
Data Visualisation Coursework Assignment Sample


You are asked to carry out an analysis of a dataset and to present your findings in the form of a maximum of two (2) visualisations (or a single (1) dashboard comprising a set of linked sub-visualisations), along with an evaluation of your work.

You should find one or more freely available dataset(s) on any topic, (with a small number of restrictions, see below) from a reliable source. You should analyse this data to determine what the data tells you about its particular topic and should visualise this data in a way that allows your chosen audience to understand the data and what the data shows. You should create a maximum of two (2) visualisations of this data that efficiently and effectively convey the key message from your chosen data.

It should be clear from these visualisations what the message from your data is. You can use any language or tool you like to carry out both the analysis and the visualisation, with a few conditions/restrictions, as detailed below. All code used must be submitted as part of the coursework, along with the data required, and you must include enough instructions/information to be able to run the code and reproduce the analysis/visualisations.

Dataset Selection

You are free to choose data on any topic you like, with the following exceptions. You cannot use data connected to the following topics:

1. COVID-19. I’ve seen too many dashboards of COVID-19 data that just replicate the work of either Johns Hopkins or the FT, and I’m tired of seeing bar chart races of COVID deaths, which are incredibly distasteful. Let’s not make entertainment out of a pandemic.

2. World Happiness Index. Unless you are absolutely sure that you’ve found something REALLY INTERESTING that correlates with the world happiness index, I don’t want to see another scatterplot comparing GDP with happiness. It’s been done too many times.

3. Stock Market data. It’s too dull. Treemaps of the FTSE100/Nasdaq/whatever index you like are going to be generally next to useless, candle charts are only useful if you’re a stock trader, and I don’t get a thrill from seeing the billions of dollars hoarded by corporations.

4. Anything NFT/Crypto related. It’s a garbage pyramid scheme that is destroying the planet and will likely end up hurting a bunch of people who didn’t know any better.


The data used for this reflective study comes from the World Development Indicators. The dataset contains information on trade, industry, and water withdrawal, together with income factors for different regions, countries, and income groups. A dashboard was created in Tableau using the two datasets, named country and country codes. The form of presentation used is a bar graph (Hoekstra and Mekonnen, 2012).

1. Trade data, Industrial data and Water withdrawal data vs regions.

Figure 1: Region vs Trade data, Industrial data and Water withdrawal data.

The first visualization presents the Trade, Industrial, and Water Withdrawal data. All three series are compared across regions to give an overview of where each region stands in these sectors. For the Trade data, the leading region is clearly Europe and Central Asia, with a maximum count of 98,600; this is nearly equal to its Water Withdrawal count, with a difference of only 311. The region's Industrial count is lower, at 82,408, yet this is still the highest Industrial figure in the data.
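The grouped-bar encoding described above can be sketched in miniature with plain Python text bars. Only the Europe and Central Asia counts below come from the text; the Sub-Saharan Africa values are hypothetical placeholders, since the report does not state them.

```python
# Schematic sketch of Figure 1's grouped bar chart using text bars.
# Only the Europe & Central Asia figures come from the text (Trade 98,600;
# Industrial 82,408; Water = Trade - 311). The Sub-Saharan Africa values
# are hypothetical placeholders for illustration.
regions = {
    "Europe & Central Asia": {
        "Trade": 98_600,
        "Industrial": 82_408,
        "Water withdrawal": 98_600 - 311,
    },
    "Sub-Saharan Africa (hypothetical)": {
        "Trade": 90_000,
        "Industrial": 65_000,
        "Water withdrawal": 88_000,
    },
}
SCALE = 10_000  # one '#' block per 10,000 units
for region, series in regions.items():
    print(region)
    for sector, value in series.items():
        print(f"  {sector:<16} {'#' * round(value / SCALE)} {value:,}")
```

Scaling every bar by the same unit (here 10,000) mirrors what the shared axis in Figure 1 does: it makes the near-equality of the Trade and Water Withdrawal bars visible at a glance.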

The next leading region is Sub-Saharan Africa, though only for the Trade and Water Withdrawal data; the leading region for Industrial data is the Middle East and North Africa.

Overall, these findings suggest that Europe and Central Asia offers the most significant opportunities for businesses and organizations in the Trading and Industrial sectors. Meanwhile, Sub-Saharan Africa and Latin America and the Caribbean offer promising opportunities in the Trading sector, and the Middle East and North Africa has potential in the Industrial sector.

These findings also highlight the need for policymakers to focus on improving access to resources and infrastructure in regions where these counts are lower, to boost economic growth and development. The success of a visualization depends on several factors, such as the choice of visual encoding, the clarity of the labels and titles, and the overall design; judged on these criteria, this can be considered a successful visualization.

Moreover, the visualization provides a comprehensive overview of the data, allowing viewers to understand the relationships and patterns between the different sectors and regions. Comparing the Trade, Industrial, and Water Withdrawal data across regions also lets viewers quickly identify which regions lead in each sector and which have potential for growth.

The analysis provided in the visualization also adds value by highlighting the implications of the data, such as the need for policymakers to focus on improving access to resources and infrastructure in regions where the count of these data is lower to boost economic growth and development. This contextual information helps viewers to understand the underlying causes and implications of the data, providing a more complete picture of the situation (Batt et al., 2020).

Furthermore, the analysis provides insights into the regions that offer the most significant opportunities for businesses and organizations in terms of trading and industrial sectors, and the regions that have potential for growth and development. This information can be valuable for policymakers and stakeholders looking to invest in or improve infrastructure and resources in these regions.

The visualizations are well designed, using different colours to represent each group, with clear labels and tags that make them easy for viewers to understand, so the visualizations can be judged a success on completion. However, additional analysis and contextual information may still be required to understand the underlying causes and implications of the data.

2. Source of income and expenditure cost for different income groups and regions.

Figure 2: Count of source of income and expenditure cost for different income groups and regions.

This visualization covers the income groups in different regions and compares the count of income sources with total expenditure. It gives a clear picture of the data for every income class.

One key observation is that the lower middle-income group seems to have more balanced results compared to other income groups. However, there are still significant difficulties faced by people in South Asia, where the count of income sources is low for all income groups.

Another important observation is that Sub Saharan Africa appears to have the highest count for the source of income overall, while Latin America and the Caribbean have the highest count for the upper middle-income group. On the other hand, the Middle East & Africa and North America have the lowest count of income sources among the high-income group, which indicates that there is a significant disparity in income sources and expenditures across different regions. It is important to create more opportunities for income generation and improve access to education, training, and resources to enable people to improve their income and standard of living (Lambers and Orejas, 2014).

The visualization effectively communicates the findings about the disparities in income sources and expenditures across different regions and income groups. It highlights the areas where people face significant difficulties, such as South Asia, where the count of income sources is low for all income groups. The visualization also provides valuable insights into the regions where there are opportunities for income generation, such as Sub-Saharan Africa and Latin America and the Caribbean.

Overall, this visualization succeeds in communicating complex information about income groups and their sources of income and expenditure in a clear and understandable way. It effectively highlights the disparities between different regions and income groups, and the need for policies and programmes to improve access to education, training, and resources so that people can improve their income and standard of living.



MIS602 IT Report

Task Instructions

Please read and examine carefully the attached MIS602_Assessment 2_Data Implementation_ Case study and then derive the SQL queries to return the required information. Provide SQL statements and the query output for the following:

1 List the total number of customers in the customers table.

2 List all the customers who live in any part of CLAYTON. List only the Customer ID, full name, date of birth and suburb.

3 List all the staff who have resigned.

4 Which plan gives the biggest data allowance?

5 List the customers who do not own a phone.

6 List the top two selling plans.

7 What brand of phone(s) is owned by the youngest customer?

8 From which number was the oldest call (the first call) made?

9 Which tower received the highest number of connections?

10 List all the customerIDs of all customers having more than one mobile number.
Note: Only CustomerId should be displayed.

11 The company is considering changing the plan durations with 24 and 36 days to 30 days.
(a) How many customers will be affected? Show SQL to justify your answer.
(b) What SQL will be needed to update the database to reflect the upgrades?

12 List the staffId, full name and supervisor name of each staff member.

13 List all the phone numbers which have never made any calls. Show the query using:
i. Nested Query
ii. SQL Join

14 List the customer ID, customer name, phone number and the total number of hours the customer was on the phone during August of 2019 from each phone number the customer owns. Order the list from highest to the lowest number of hours.

15 i. Create a view that shows the popularity of each phone colour.

ii. Use this view in a query to determine the least popular phone colour.

16 List all the plans and total number of active phones on each plan.

17 List all the details of the oldest and youngest customer in postcode 3030.

18 Produce a list of customers sharing the same birthday.

19 Find at least 3 issues with the data and suggest ways to avoid these issues.

20 In not more than 200 words, explain at least two ways to improve this database based on what you have learned in weeks 1-8.



A database management system can be defined as a program that is used for defining, developing, managing, and controlling access to a database. A successful information system gives accurate, timely, and relevant information to users so that it can be used for better decision-making. The decision-making process should be founded on timely and appropriate information and data so that decisions align with business objectives. The role of a DBMS in an information system is to minimize or eliminate data redundancy and to maximize data consistency (Saeed, 2017). In this assessment, the case study of a mobile phone company has been given. Employees sell phones and plans with specific features to clients. Calls are charged per minute, in cents, and plan durations start from one month. The main purpose of this assessment is to understand the requirements behind various data information requests against the given database structure and to develop SQL statements for the given queries.

Database Implementation

As the database has been designed on the basis of the given ERD diagram, the next step is to implement that design. MySQL has been used for the implementation (Eze & Etus, 2014). The main reasons for choosing MySQL are that a free version is available on the internet, the database engine works flexibly with many programming languages, and good security is also provided (Susanto & Meiryani, 2019).

Entity Relationship Diagram

The given ERD diagram for this mobile phone company database is as follows:

Implementation of Database for Mobile Phone Company

Database and Table Creation

Create schema Mobile_Phone_Company;
Use Mobile_Phone_Company;

Table Structure








Data Insertion
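The table-creation and data-insertion steps were presented as screenshots. As a stand-in, here is a minimal, hypothetical sketch of a compatible schema using Python's sqlite3 module instead of MySQL. Column names are inferred from the queries later in this report, `PhonePlan` stands in for the `Plan` table, and the two sample rows are invented.

```python
import sqlite3

# Minimal sketch of a schema compatible with the queries in this report,
# using Python's sqlite3 in place of MySQL. The actual assignment schema
# may differ; "PhonePlan" stands in for the Plan table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Customer (
    CustomerID INTEGER PRIMARY KEY,
    Given TEXT, Surname TEXT, DOB TEXT, Suburb TEXT, Postcode INTEGER
);
CREATE TABLE PhonePlan (
    PlanName TEXT PRIMARY KEY, BreakFee REAL, DataAllowance INTEGER,
    MonthlyFee REAL, PlanDuration INTEGER, CallCharge REAL
);
CREATE TABLE Mobile (
    MobileID INTEGER PRIMARY KEY,
    CustomerID INTEGER REFERENCES Customer(CustomerID),
    PlanName TEXT REFERENCES PhonePlan(PlanName),
    PhoneNumber TEXT, BrandName TEXT, PhoneColour TEXT, Cancelled TEXT
);
CREATE TABLE Calls (
    CallID INTEGER PRIMARY KEY,
    MobileID INTEGER REFERENCES Mobile(MobileID),
    CallDate TEXT, CallDuration INTEGER
);
""")
# Two invented customers, enough to exercise query 1 below.
conn.executemany("INSERT INTO Customer VALUES (?,?,?,?,?,?)", [
    (1, "Amy", "Wong", "1990-04-01", "CLAYTON", 3168),
    (2, "Ben", "Lee", "1985-09-12", "WERRIBEE", 3030),
])
# Query 1 from the report: total number of customers.
total = conn.execute("SELECT COUNT(CustomerID) FROM Customer").fetchone()[0]
print(total)  # 2
```

Against this toy database, query 1 returns 2, matching the two inserted rows; the real assignment database will of course contain different data.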








SQL Queries

1. Total number of customers
Select Count(CustomerID) from Customer;

2. Customers in CLAYTON
Select CustomerID, Concat(Given,' ',Surname) as FullName, DOB, Suburb from customer
Where suburb like '%CLAYTON%';

3. Resigned Staff

Select * from staff where Resigned is not null ;

4. Biggest Data Allowance Plan

SELECT PlanName, BreakFee, DataAllowance, MonthlyFee,
PlanDuration, CallCharge from PLAN
Order By DataAllowance Desc Limit 1;

5. Customers who don’t have phone

SELECT CustomerID, CONCAT(Given,' ',Surname) AS FullName, DOB, Suburb
from Customer WHERE NOT EXISTS (Select CustomerID from Mobile
WHERE Mobile.CustomerID=Customer.CustomerID);

6. Top two selling plans

SELECT Mobile.PlanName, Count(*) AS TimesSold
FROM Mobile JOIN Plan ON Mobile.PlanName = Plan.PlanName
GROUP BY Mobile.PlanName
ORDER BY TimesSold DESC LIMIT 2;

7. Brand owned by youngest customers

SELECT BrandName from mobile WHERE
CustomerID IN (SELECT customerid From Customer where
dob=(select max(dob) From Customer));

8. The first call made by which number

SELECT mobile.phonenumber, calls.calldate
from mobile, calls where
calls.mobileid=mobile.mobileid and
calls.calldate=(select min(calldate) from calls);

9. Tower that received the highest number of connections

SELECT * from Tower WHERE
MaxConn=(Select Max(MaxConn) from Tower);

10. Customers who have more than one mobile number.

SELECT CustomerID from mobile
Group By CustomerID HAVING Count(PhoneNumber)>1;

11. (a) Number of customers affected

SELECT Count(Distinct Mobile.CustomerID) from Mobile, plan where
mobile.planname=plan.planname and
planduration in(24,36);

(b) Update database
Update Plan set planduration=30
where planduration in (24,36);
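Question 11's two steps can be checked end-to-end on a toy database. This sketch uses Python's sqlite3 with invented data and a `PhonePlan` table standing in for `Plan`: it counts the affected customers first, then applies the update and confirms no 24- or 36-day plans remain.

```python
import sqlite3

# Toy demonstration of question 11: count affected customers, then update.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE PhonePlan (PlanName TEXT PRIMARY KEY, PlanDuration INTEGER);
CREATE TABLE Mobile (MobileID INTEGER PRIMARY KEY, CustomerID INTEGER, PlanName TEXT);
INSERT INTO PhonePlan VALUES ('Basic', 24), ('Plus', 36), ('Max', 12);
INSERT INTO Mobile VALUES (1, 10, 'Basic'), (2, 11, 'Plus'), (3, 12, 'Max');
""")
# (a) customers on a 24- or 36-day plan
affected = conn.execute("""
    SELECT COUNT(DISTINCT m.CustomerID)
    FROM Mobile m JOIN PhonePlan p ON m.PlanName = p.PlanName
    WHERE p.PlanDuration IN (24, 36)
""").fetchone()[0]
# (b) the update, then confirm nothing is left on the old durations
conn.execute("UPDATE PhonePlan SET PlanDuration = 30 WHERE PlanDuration IN (24, 36)")
remaining = conn.execute(
    "SELECT COUNT(*) FROM PhonePlan WHERE PlanDuration IN (24, 36)"
).fetchone()[0]
print(affected, remaining)  # 2 0
```

Counting before updating matters: once (b) has run, the 24- and 36-day rows no longer exist, so the count in (a) can no longer be reproduced.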

12. Staff members

Select S1.StaffID, CONCAT(S1.Given,' ',S1.Surname) AS FullName,
CONCAT(S2.Given,' ',S2.Surname) AS SupervisorName From
Staff S1 Left Join Staff S2
On S1.Supervisor = S2.StaffID; -- assuming the Staff table stores the supervisor's StaffID in a Supervisor column

13. Phone number which have not made any call

Nested Query
SELECT PhoneNumber from mobile
where not exists
(Select MobileID from calls where calls.mobileid=mobile.mobileid);

SQL Join

SELECT PhoneNumber from mobile Left Join
calls on calls.mobileid=mobile.mobileid
where calls.mobileid is null;
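The two formulations of question 13 are equivalent anti-joins, which can be demonstrated on a toy schema (invented data, Python's sqlite3 standing in for MySQL): the `NOT EXISTS` subquery and the `LEFT JOIN ... IS NULL` filter return the same phone numbers.

```python
import sqlite3

# Toy demonstration that the NOT EXISTS form and the LEFT JOIN ... IS NULL
# form of query 13 return the same "never called" phone numbers.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Mobile (MobileID INTEGER PRIMARY KEY, PhoneNumber TEXT);
CREATE TABLE Calls  (CallID INTEGER PRIMARY KEY, MobileID INTEGER);
INSERT INTO Mobile VALUES (1, '0400 111 222'), (2, '0400 333 444');
INSERT INTO Calls  VALUES (1, 1);  -- only mobile 1 has ever made a call
""")
nested = conn.execute("""
    SELECT PhoneNumber FROM Mobile m
    WHERE NOT EXISTS (SELECT 1 FROM Calls c WHERE c.MobileID = m.MobileID)
""").fetchall()
joined = conn.execute("""
    SELECT m.PhoneNumber FROM Mobile m
    LEFT JOIN Calls c ON c.MobileID = m.MobileID
    WHERE c.MobileID IS NULL
""").fetchall()
print(nested == joined, nested)  # True [('0400 333 444',)]
```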

14. List the customer ID, customer name, phone number and the total number of hours the customer was on the phone during August of 2019 from each phone number the customer owns. Order the list from highest to the lowest number of hours.

select mobile.customerid, CONCAT(Customer.Given,' ',Customer.Surname) AS FullName,
mobile.phonenumber, sum(calls.callduration)/60 as NoOfHours -- assuming CallDuration is recorded in minutes
from calls, mobile, customer where
calls.mobileid=mobile.mobileid and
mobile.customerid=customer.customerid and
month(calls.calldate)=8 and year(calls.calldate)=2019 group by
mobile.customerid, Customer.Given, Customer.Surname, mobile.phonenumber
Order by NoOfHours desc;

15. (i) View Creation

Create or Replace View view_color as Select PhoneColour, Count(MobileID) AS MobileNum From Mobile
Group By PhoneColour;

(ii) View Query

Select PhoneColour, MobileNum from view_color
Order By MobileNum Asc Limit 1;
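The view from question 15 can be exercised on a toy table to confirm that ordering by the counted column and taking one row yields the least popular colour. The data here is invented, and Python's sqlite3 stands in for MySQL.

```python
import sqlite3

# Toy demonstration of question 15: a view counting phones per colour,
# then a query over the view for the least popular colour.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Mobile (MobileID INTEGER PRIMARY KEY, PhoneColour TEXT);
INSERT INTO Mobile VALUES (1, 'Black'), (2, 'Black'), (3, 'Red');
CREATE VIEW view_color AS
    SELECT PhoneColour, COUNT(MobileID) AS MobileNum
    FROM Mobile GROUP BY PhoneColour;
""")
least = conn.execute(
    "SELECT PhoneColour, MobileNum FROM view_color ORDER BY MobileNum LIMIT 1"
).fetchone()
print(least)  # ('Red', 1)
```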

16. Active phone plans

Select mobile.planname, count(mobile.phonenumber) from
mobile, plan where mobile.planname=plan.planname and
mobile.cancelled is null
group by mobile.planname;

17. Oldest and youngest customer

-- assuming the Customer table has a Postcode column
Select * from customer where postcode = 3030 and dob =
(select min(dob) from customer where postcode = 3030);

Select * from customer where postcode = 3030 and dob =
(select max(dob) from customer where postcode = 3030);

18. Customers with same birthdays

select c.customerid, c.given, c.surname, c.DOB from customer c
where c.DOB in (select DOB from customer group by DOB having count(*) > 1)
order by c.DOB;

Issues with the data

The main issues with the data are as following:

- The relationship in the Staff table for defining a supervisor is complicated, as the table is self-joined to maintain the relationship.
- The overall relationship between tower, plan, mobile and calls is very complicated.
- A clean data policy is not used for data insertion.

Ways to improve the database

Although the database structure has been defined in third normal form, some steps are still possible to improve it. The self-join used in the Staff table to record supervisors is difficult to maintain, so a separate table listing supervisors could be used instead. Relatedly, a mobile has a plan and makes calls, and the calls are routed through the towers listed in the tower table, so this chain of relationships should be kept as simple as possible.
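The suggested restructuring, moving the supervisor relationship out of Staff into its own table, can be sketched as follows. Table and column names here are illustrative, not part of the assignment schema, and the rows are invented.

```python
import sqlite3

# Sketch of the suggested restructuring: keep the supervisor relationship
# in its own table instead of a self-referencing column on Staff.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Staff (StaffID INTEGER PRIMARY KEY, Given TEXT, Surname TEXT);
CREATE TABLE Supervises (
    SupervisorID INTEGER REFERENCES Staff(StaffID),
    StaffID      INTEGER REFERENCES Staff(StaffID),
    PRIMARY KEY (StaffID)      -- each staff member has at most one supervisor
);
INSERT INTO Staff VALUES (1, 'Pat', 'Ng'), (2, 'Sam', 'Roy');
INSERT INTO Supervises VALUES (1, 2);  -- Pat supervises Sam
""")
# Query 12 then becomes two plain joins instead of a self-join on Staff.
row = conn.execute("""
    SELECT s.Given, sup.Given FROM Supervises v
    JOIN Staff s   ON s.StaffID   = v.StaffID
    JOIN Staff sup ON sup.StaffID = v.SupervisorID
""").fetchone()
print(row)  # ('Sam', 'Pat')
```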

Secondly, in order to secure this database, authorized data access is required: users should only be able to retrieve the data they need. Full data access privileges should be granted only to the administrator or a top management official who actually requires all data reports in order to make better decisions.



MIS500 Foundation of Information System Assignment Sample

Introduction to Chanel

Chanel is a French luxury fashion house founded by Coco Chanel in 1910. It focuses on women's high fashion and ready-to-wear clothes, luxury goods, perfumes, and accessories. The company is privately owned by the Wertheimer family and is one of the last family businesses in the world of fashion and luxury, with revenues of €12.8 billion (2019) and net income of €2.14 billion (2019).

Chanel – Case Study

To complete this assessment task you are required to design an information system for Chanel to assist with their business. You have discussed Porter’s Value Chain in class and you should understand the primary and support activities within businesses. For this assessment you need to concentrate on Marketing and Sales and on why Chanel cannot survive without a Digital Strategy.

Source: Porter’s Value Chain Model (Rainer & Prince, 2019).
Read the Chanel case study which will be distributed in Class during Module 3.1. Visit these websites


Based on the information provided as well as your own research (reading!) into information systems write a report for Chanel to assist them in developing a ‘Digital Strategy’ to develop insights for their marketing and sales especially in online sales. Please structure the group report as follows:

- Title page

- Introduction

- Background to the issue you plan to solve

- Identify and articulate the case for a Digital Strategy at Chanel (based upon the data do you as a group of consultants agree or disagree)

- Research the issues at Chanel and present a literature review – discuss the marketing and sales data analysis needs and the range of BI systems available to meet these needs.

- Recommended Solution – explain the proposed Digital Strategy and information systems and how it will assist the business. You may use visuals to represent your ideas.

- Conclusion

- References (quality and correct method of presentation. You must have a minimum of 15 references)



Businesses are upgrading their operations by implementing a digital strategy in order to compete against rivals and stay in business. In doing so, companies must continuously adjust their business strategies and procedures to keep attracting the newer generation of customers or else face certain doom. This paper is based on Chanel, a luxury fashion brand based in Neuilly-sur-Seine, France. Chanel's business challenges in the marketplace are briefly assessed and examined in this research. In addition, the paper will briefly outline the advertising and marketing process, as well as how Chanel should embrace a digital strategy to maintain growth in the following decade.

Background to the issue you plan to solve

The issue is that luxury brands such as Chanel are lagging behind the rapidly developing trend of e-commerce, and they need to implement a comprehensive Digital Strategy in order to keep their existing customers and expand their market share. Traditionally, luxury brand companies considered online shopping a platform for lower-end products and did not focus on investing in their social presence (Dauriz, Michetti, et al., 2014). However, the rapid development of online shopping platforms and the changing behaviour of customers, coupled with lockdown measures and cross-border restrictions due to the COVID-19 pandemic, have exposed the importance of digital sales and marketing even for luxury brands that depend heavily on in-person retail sales (McKinsey & Company, 2021). Fashion experts warn that luxury companies will not survive the current crisis unless they invest in their digital transformation (Gonzalo et al., 2020).

According to the global e-commerce outlook report carried out by CBRE, the world's largest commercial real estate services and investment firm, online retail sales accounted for 18 per cent of global retail sales in 2020, a 140 per cent increase over the last five years, and are expected to reach 21.8 per cent in 2024 (CBRE, 2021). Meanwhile, as digital technology advances, customer behaviour is changing rapidly: customers not only prefer to make their purchases online, but also make decisions based on their online searches (Dauriz, Michetti, et al., 2014). However, e-commerce accounted for only 4 per cent of luxury sales in 2014 (Dauriz, Remy, et al., 2014) and reached just 8 per cent in 2020 (Achille et al., 2018). This shows that luxury brands have been slow to adapt to the changing environment of global trade and customer behaviour. On the other hand, at least 45 per cent of all luxury sales are influenced by customers' social media experience and the company's digital marketing (Achille et al., 2018). Moreover, almost 30 per cent of luxury sales are made by tourists travelling outside their home countries, so the luxury industry has been adversely impacted by the current cross-border travel restrictions. In addition, fashion weeks and trade shows were disrupted for almost two years due to the pandemic. Therefore, fashion experts suggest that luxury companies enhance their digital engagement with customers and digitalize their supply chains (Achille & Zipser, 2020).

Chanel is the leading luxury brand for women's couture in the world. Its annual revenue of $2.54 billion is one of the highest in the industry (Statista, 2021). Chanel's digital presence is quite impressive. It is one of the "most pinned brands" on social media, being pinned 1,244 times per day (Marketing Tips For Luxury Brands | Conundrum Media, n.d.). It has 57 million followers on social media, and its posts are viewed by 300 million people on average (Smith, 2021). It has also been commended by some fashion experts for its "powerful narrative with good content" in its social media marketing (Smith, 2021). However, it has also been criticized for a poor website that is not user-friendly (Taylor, 2021) and for its reluctance towards e-commerce (Interino, 2020). Therefore, Chanel needs to improve its digital presence by developing a comprehensive Digital Strategy.

Identify and articulate the case for a Digital Strategy at Chanel

Case for digital strategy at Chanel

As-Is State

After reviewing the Chanel case, as consultants, we are all dissatisfied with the company's digital strategy. Before making any kind of choice, businesses must first comprehend the customer's perspective. The current state of the firm's commerce was determined from the provided case, with the existing web-based platform employed by the company being fairly restrictive. For instance, the company has built an eCommerce platform offering cosmetics and fragrance in fewer than 9 countries. The firm's internet penetration is lower than that of other industry players, and the business offers only a restricted set of e-services. Not only that, but the organisation uses many systems and databases in various geographical regions, which creates a disjointed experience for end consumers. Besides that, the company is encountering technology organisation issues, such as failing to adequately align existing capabilities with forthcoming challenges and employing diverse models, all of which add to the business's complexity. Simultaneously, its social media marketing is grossly insufficient, failing to reach the target luxury audience as it should.

To-Be State

Following an examination of the firm's present digital strategy, it was discovered that the company has a number of potential opportunities that it must pursue in order to stay competitive in the market. The major goals of the Chanel firm, according to analysis and research, are to improve the customer experience, bring in new consumers, establish brand connection, inspire advocacy, and raise product awareness. It has been determined that Chanel's digital strategy is outdated, as a result of which the company is unable to compete successfully with its rivals. Major competitors of Chanel, for example, used successful digital channels to offer products to end-customers throughout the epidemic. It is suggested that the organisation implement an information system that can provide customers with a personalised and engaging experience. To resolve the existing business issues, it is critical for the organisation to incorporate advanced technology into its organisational processes in order to capture market share. The company's existing business challenges and their implications can be remedied by upgrading its e-commerce website and integrating it with new scalable technologies such as AI, Big Data, Machine Learning, and analytics. The company must also optimize its product line and re-evaluate its core value proposition for new-age luxury customers.

Literature Review

People have always been fascinated by stories, which are more easily remembered than facts. Well-told brand stories appear to have the potential to influence consumers' brand experiences, which are "conceptualized as sensations, feelings, cognitions, and behavioral responses evoked by brand-related stimuli that are part of a brand's design and identity, packaging, communications, and environments" (Brakus et al., 2009, p. 52). Storytelling in a digital world is one of the most effective ways to enable conversations between the brand and consumers. Chanel takes advantage of digital marketing to communicate the core value of the brand to consumers via its website and social media: designer, visionary, artist, Gabrielle 'Coco' Chanel reinvented fashion by transcending its conventions, creating an uncomplicated luxury that changed women's lives forever. She followed no rules, epitomizing the very modern values of freedom, passion and feminine elegance (Chanel, 2005). For instance, the short film "Once Upon A Time..." by Karl Lagerfeld reveals Chanel's rebellious personality while showcasing her unique tailoring approach and use of unusual fabrics. Inside Chanel presents a chronology of the house's history, showing how it evolved from hats to fashion and became a leading luxury brand. No doubt Chanel has done an excellent job at narrating its culture, values, and identity, but the content is mostly based on stories created by marketers or on the life of Coco Chanel. The brand image is static and homogeneous, and it reads like one-way communication: consumers cannot interact or participate in the brand's narrative.

Social media is more likely to serve as a source of information for luxury brands than as a tool for relationship management (Riley & Lacroix, 2003). Chanel was the most popular luxury brand on social media worldwide in April 2021, based on the combined number of followers on its Facebook, Instagram, Twitter, and YouTube pages, with an audience of 81.4 million subscribers (Statista, 2021). Chanel, as a prestigious luxury brand, takes an exclusive, even arrogant, stance on social media. It gives the audience multiple ways to access the valuable content it creates while keeping them away from the comments that content generates. The reasoning behind this approach is that Chanel wants to maintain consistency with its brand values and identity, which are associated with elegance, luxury, quality, attention to detail, and a less-is-more approach. Nevertheless, social media can be a powerful tool that provides social data to better understand and engage customers, gain market insights, deliver better customer service, and build stronger customer relationships.

However, despite having the most social media followers, Chanel has the lowest Instagram interaction rate compared to Louis Vuitton, Gucci and Dior. Marketers and researchers increase the success rate of social media marketing by engaging with audiences and consumers in real time and by collecting audiences' big data for investigation. Put another way, social media engagement results in sales. It is imperative for Chanel not just to observe this model from afar, but to actively challenge itself to take advantage of it. To maintain its leadership in the luxury brand market, it must keep up with the constant changes in the digital world and marketplace and be more engaging with its audiences.

Chanel's revenue dropped dramatically from $12,273m to $10,108m (-17.6%) in 2020 due to the global pandemic, as international travel was suspended and boutique and manufacturing networks were closed (Chanel, 2021). The pandemic resulted in a surge in e-commerce and accelerated digital transformation; hence, many luxury fashion brands pivoted their business from retail to e-commerce, including Chanel's competitors Gucci and Dior. Chanel is gradually adapting its digital strategy and selling products online, but only perfume and beauty products. President of Chanel Fashion and Chanel SAS, Bruno Pavlovsky, said: "Today, e-commerce is a few clicks and products that are flat on a screen. There's no experience. No matter how hard we work, no matter how much we look at what we can do, the experience is not at the level of what we want to offer our clients." (L. Guibault, 2021) In 2018, Chanel Fashion entered a cooperation with Farfetch purely to develop initiatives that enhance in-store consumer experiences; the house insists on incorporating human interaction and a personal touch in fashion sales. Experts foresee that the pandemic could end in 2022, but Covid may never completely vanish, and life will never be the same again. Consumer behaviour has changed during Covid and will not follow a linear curve: consumers will buy more online, reduce shopping frequency, and shift to stores closer to home (S. Kohli et al, 2020). It is important to enhance digital engagement, but e-commerce is essential to maintain sales. The shift might not have had a substantial impact on Chanel's fashion sales in the past two years, but this will change with the advent of a new luxury consumer who wants high-quality personalised experiences offline and online.
Chanel needs to adapt fast and demonstrate its trustworthiness by providing a superior buying experience, exceptional customer service, and one-on-one connections in store and on its e-commerce platform.

Recommended Solution

1. Deliver the company culture using a more efficient strategy

The culture, value and identity of Chanel come mainly from Coco Chanel. Although this is impressive, it is no longer attractive enough for the newly emerging market, and Chanel needs to deliver its unique culture in a more effective way. For example, Chanel could launch a campaign inviting all customers to pay tribute to Coco Chanel: customers could send Chanel the design they consider the most representative of Coco Chanel's style. This would encourage more customers to be curious about the culture and stories behind the brand, instead of having the story told in one-way communication. Especially in such an information-saturated time, a long unique heritage does little to attract more customers unless it is used in a way that suits their current purchasing habits. According to Robert R (2006), it is wiser to create value with customers instead of simply using them; converting them from defectors to fans is more likely to happen when they are bonded with the brand. Moreover, Chanel used to focus more on the in-store retail experience, which might be part of its culture, since Chanel is a traditional luxury brand. However, people are more used to online shopping nowadays, and this is the trend, so Chanel needs to invest more in online service to exhibit its culture and adapt to the current habits of consumers. The website of Chanel is fancy, with nice colours and visuals, but it is almost impossible for a customer to find what they are looking for. A stylish website cannot be converted directly into revenue; Chanel should make its website more user-friendly and functional. This is not hard for such a huge company once it recognises the issue.

2. Bond with the customers

Chanel used to have the largest number of followers on social media but has fallen behind Gucci and LV in the past few years, because it pushed too much content without enough interaction. Chanel needs to build stronger bonds with both existing and potential customers. Communication between Chanel and its customers has historically been one-way: consumers receive messages from Chanel but have no channel through which to explain what they think about the brand and what they need from it. Therefore, Chanel should build a closer relationship with its customers through social media. The reasons for using social media as the channel are as follows. Firstly, it is a cost-effective way to reach a huge market: Chanel could let more people know about its changes and newest products, and could target different advertisements at selected customer segments. Secondly, social media establishes a platform where Chanel can listen to the real needs of its customers. Many customers want a platform through which to tell the brand what they need, and hope to witness the brand change in response (Grant L, 2014). A successful brand should make customers believe that what they think matters; although there is no need to adapt to every customer preference, Chanel needs to show that it treasures its relationship with its customers. Finally, failing to use social-media platforms could lead to a huge loss of market share: while other brands are posting advertisements and communicating with customers, they are stealing customers from Chanel, and Chanel needs to defend itself with the same weapons. In conclusion, engaging customers in projects and conversations with the brand can help establish long-term relationships and increase customer loyalty.

3. Optimize the product line of the online store

The e-commerce market has grown remarkably in the past few years, and because of COVID-19 people have become more used to online shopping. Therefore, Chanel needs to optimise the product line of its online store, bring its fashion line online, and meet more of its customers' demands. Although the offline shopping experience of a luxury brand has significant value, offering an extra choice could also be impressive, because customers are better informed and increasingly expect the brand to solve their problems and deliver an unforgettable shopping experience. One field in which Chanel could invest is the VR/AR fitting room. Customers may be unable to visit a retail shop, or the shop may not stock a suitable size; a VR/AR fitting room enables them to try various products online and choose their favourite. It is also more efficient, since they can do it anywhere and at any time. If customers do not mind sharing detailed information, the VR fitting room could generate a personal model for the client, making the experience more visual. This could enhance the shopping experience and attract more potential customers. In addition, Chanel could grant different levels of access to different customer tiers, which would help preserve the company culture of providing the best service to high-net-worth clients; customers could raise their tier by building up a purchase history. In summary, bringing a unique online shopping experience to customers would enable Chanel to capture more of the market and establish a better platform for further development.


This report studied the case of Chanel and analysed the problems the company was suffering from. It examined the issues present in the organisation and found that Chanel had lost its unique value proposition along the way and had also lagged behind in social media and web presence. Moreover, the firm's existing e-commerce platform has many weaknesses that negatively affect the company's business continuity and market survival. Accordingly, after careful analysis, several strategies were suggested so that the company can fix its social media, its digital presence, and its targeting of today's new breed of luxury customers.



DATA4300 Data security and Ethics Report Sample

Part A: Introduction and use of monetisation

- Introduce the idea of monetisation.
- Describe how it is being used by the company you chose.
- Explain how it is providing benefit for the business you chose.

Part B: Ethical, privacy and legal issues

- Research and highlight possible threats to customer privacy and possible ethical and legal issues arising from the monetisation process.
- Provide one organisation which could provide legal or ethical advice.

Part C: GVV and code of conduct

- Now suppose that you are working for the company you chose as your case study. You observe that one of your colleagues is doing something novel for the company, while at the same time taking advantage of the monetisation for themselves. You want to report the misconduct. Describe how giving voice to values can help you in this situation.

- Research the idea of a code of conduct and explain how it could provide clarity in this situation.

Part D: References and structure

- Include a minimum of five references
- Use the Harvard referencing style
- Use appropriate headings and paragraphs


Introduction and use of Monetization

Idea of Monetization

According to McKinsey & Co., the most successful and fastest-growing firms have embraced data monetization and made it an integral component of their strategy. Through direct data monetization, one can sell third parties access to data in two ways: either in its raw form, or in the form of analysis and insights. Data can help a firm work out how to get in touch with its customers and learn about their habits in order to increase sales. It is also possible to use data to identify where and how to cut costs, avoid risk, and increase operational efficiency. For the given case, the chosen industry is social media (Faroukhi et al., 2020).

How it is being used in the chosen organization

In order for Facebook to monetize its user data, it must first amass a large number of data points. These include information on whom we communicate with, what we consume and react to, and which websites and apps we visit outside of Facebook; many additional data points are collected beyond these (Mehta et al., 2021). Because of the predictive potential of machine-learning algorithms, the company can infer such information even when users don't explicitly reveal it themselves. The intelligence gathered through this behavioural tracking is the essence of what is sold to Facebook's customers (Child and Starcher, 2016). Facebook generates 98 percent of its income from advertising, which is how this data is put to use.

Providing benefits to the organization chosen

Facebook's clients (advertisers and companies, not users) receive a wealth of advantages. Advertisers can target specific groups of people based on this information and adjust the message based on what actually works with them. Over ten million businesses, mostly small ones, make use of Facebook's advertising platform. The Facebook Ads platform allows them to present advertising to targeted consumers and provides thorough performance data on how various campaigns, including different visuals, performed (Gilbert, 2018).

Ethical, Privacy and legal Issues

Threats to consumers

According to reports, Facebook has long been known for using cookies, social plug-ins, and pixels to monitor both users and non-users of the platform. Even people without a Facebook account are not safe from this tracking, because a slew of other data sources can be used in place of Facebook, and non-members can be monitored whenever they visit any website that features the Facebook logo. In addition to cookies, web beacons were one of the numerous kinds of internet tracking that may be employed across websites, and the resulting entries could be sold to interested stakeholders. As a result, targeted voters might encounter reinforcing messages on a wide range of sites without understanding that they are the only ones receiving such communications, and without being warned that these are political campaign ads. Furthermore, governments throughout Europe and North America are increasingly requesting that Facebook hand over user data to help them investigate crimes, establish motivations, confirm or refute alibis, and uncover conversations. The phrase "fighting terrorism" has become a catch-all that has lost its meaning over time. Facebook describes this policy as follows: "We may also share information when we have a good faith belief it is necessary to prevent fraud or other illegal activity, [or] to prevent imminent bodily harm [...] This may include sharing information with other companies, lawyers, courts, or other government entities." (Facebook, 2021). In essence, privacy is upheld only at face value, while the data is exposed to Facebook, third-party advertisers, and governments alike.

IAPP can help with the privacy situation

The International Association of Privacy Professionals (IAPP) is a global leader in privacy, fostering discussion, debate, and collaboration among major industry stakeholders. It helps professionals and organisations understand the intricacies of the evolving environment and how to identify and handle privacy problems, while providing tools for practitioners to improve and progress their careers (CPO, 2021). The IAPP offers networking, education, and certification for privacy professionals, and can play a role in promoting the need for skilled privacy specialists to satisfy the growing expectations of businesses and organisations that manage data.

GVV and Code of Conduct

Fictional scenario

For the sake of a fictionalised context, I will assume that I am employed by Facebook. My colleague in this fictionalised setting is invading the privacy of businesses in a particular domain, deriving proprietary information from the data collected, and then selling it to those businesses' competitors in the same domain. There are many grey areas to contemplate and traverse when dealing with this kind of tricky situation, and managing it professionally and without overreacting is essential. The most critical thing for me would be to figure out, before getting involved, what is a genuine ethical problem and what is merely something I don't like. If my concerns are well-founded and the possible breach is significant, I would next ask myself two fundamental questions; I can proceed if both are answered with a resounding "yes".

Next, when someone works for a publicly traded and significantly large company, there should be defined regulations and processes to follow whenever one detects an unlawful or unethical violation; these ought to be found in the internal employee compliance manual. I would then decide whether to notify my supervisor. If that person is also complicit in the events, the next alternative would be to inform the reporting manager or compliance officer. Also, if I choose not to be involved in the investigation or reporting, I can either report anonymously or ask my superiors not to name me.

Reference list


ISYS1003 Cybersecurity Management Report Sample

Task Description

Since your previous two major milestones were delivered you have grown more confident in the CISO chair, and the Norman Joe organisation has continued to experience great success and extraordinary growth due to an increased demand in e-commerce and online shopping in COVID-19.

The company has now formalised an acquisition of a specialised “research and development” (R&D) group specialising in e-commerce software development. The entity is a small but very successful software start-up. However, it is infamous for its very “flexible” work practices and you have some concerns about its security.

As a result of this development in your company, you decide to prepare and plan a penetration test (pentest) of the newly acquired business. As a knowledgeable CISO you wish to save costs initially by conducting the initial pentest yourself. You will need to formulate a plan based on industry-standard steps.

Based on the advice by Chief Information Officer (CIO) and Chief Information Security Officer (CISO) the Board has concluded that they should ensure that the key services such as the web portal should be able to recover from major incidents in less than 20 minutes while other services can be up and running in less than 1 hour. In case of a disaster, they should be able to have the Web portal and payroll system fully functional in less than 2 days.


1. Carefully read the Case Study scenario document. You may use information provided in the case study, but do not simply copy and paste it; this will result in poor grades. Well-researched, high-quality external content will be necessary for you to succeed in this assignment.

2. Align all Assignment 3 items to those in previous assignments as the next stage of a comprehensive Cyber Security Management program.

You need to perform a vulnerability assessment and Business Impact Analysis (BIA) exercise:

1. Perform vulnerability assessment and testing to assess a fictional business Information system.
2. Perform BIA in the given scenario.
3. Communicate the results to the management.



A penetration test, also known as a pentest, is used to check for exploitable vulnerabilities that could be used in cyber-attacks [20]. The main reason for using penetration tests is to secure an organization: the test provides a way to examine whether policies are secure or not [14]. This type of test is very effective for any organization, and demand for penetration testing is increasing day by day.

Proposed analytical process

A penetration test is very effective for securing any type of website [1]. A pentest proceeds in five steps: the first is planning, the second is scanning, the third is gaining access, and the fourth and fifth are maintaining access and analysing the results [2].

There are five methods commonly used for penetration testing: NIST, ISSAF, PTES, OWASP, and OSSTMM. In this segment, the Open Web Application Security Project (OWASP) methodology is used [3]. The main reason for selecting this method is that it helps recognise vulnerabilities in mobile and web applications and discover flaws in development practices [15]. The tester rates risks, which helps save time. Different types of box testing are used in a pentest: black-box testing is used when the internal structure of the application is completely unknown [16,17]; white-box testing is used when the internal workings are known; and grey-box testing is used when the tester partially understands the internal working structure [13].
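As a toy illustration of the scanning step, the sketch below checks which TCP ports on a host accept connections. It is a hypothetical teaching example, not a substitute for dedicated tools such as zmap, and should only ever be run against hosts you are authorised to test:

```python
import socket

def scan_ports(host: str, ports, timeout: float = 0.5):
    """Return the subset of `ports` that accept a TCP connection on `host`."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            # connect_ex returns 0 when the connection succeeds (port open),
            # and an error code otherwise (port closed or filtered).
            if s.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports
```

For example, `scan_ports("127.0.0.1", range(1, 1024))` would list the open well-known ports on the local machine.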

Ethical Considerations

A penetration test is used to find malicious content, risks, flaws, and vulnerabilities [4]. It helps to increase confidence in the company, and the various processes involved help to increase the company's productivity and performance. Data that has been affected may also be restored with the help of a pentest.

Resources Required

Hardware components used for performing pentests include port scanners, password checkers, and more [5]. Software used for penetration testing includes zmap, hashcat, PowerShell-Suite, and many others.

Time frame

The OWASP framework has a huge user community, and many articles, techniques, and technologies are available for this type of testing. The OWASP process is also time-saving, which helps at every step [19].

Question 3.1

1. Analysis of Norman Joe before the BIA implementation

Business impact analysis (BIA) is the process of identifying and evaluating different types of potential effects [19]. These potential effects can arise in different areas, and the analysis helps address the full range of business requirements [6]. The main purpose of a pentest in securing web and mobile systems is to identify the main weaknesses or vulnerabilities of the business management system before it falls victim to major reputational and financial losses. To ensure business continuity, regular checking and penetration testing are very important for the company [12]. BIA is a very important process for Norman Joe: before implementing it, Norman Joe had many security issues, and the company also needed to improve the firewall in its network as well as the IDS [11]. As firewalls are designed only to prevent attacks from outside the network, attacks from inside the network can easily harm it and disrupt the workflow [7]; the company needs to implement internal firewalls to prevent such attacks. Firewalls can also be overloaded by DDoS attacks, against which the company needs to implement scrubbing services [16].


Figure 1: Before implementation of BIA for penetration testing

2. Analysis of Norman Joe after the BIA implementation

The business impact analysis was performed on Norman Joe to secure the company's systems, and after implementing security measures such as internal firewalls and scrubbing services, the company's data has been largely secured against cyber-security threats [8]. After implementing the BIA, the website was tested by running it: the website was first started and then its traffic was intercepted [10].

Figure 2: After implementation of BIA for penetration testing

After the intercept step, it was checked whether the website was in use [11]. If the website was not in use, the user remained on the start page; if it was in use, the protocols were identified and checked, the relevant information was gathered, and the penetration test was performed on the system [9]. Finally, the report of the penetration analysis, along with the vulnerability level, was displayed after the test, and the analysis was complete.

Reference List


COMP1680 Cloud Computing Coursework Report Sample

Detailed Specification

This coursework is to be completed individually.

Parallel processing using cloud computing

The company you work for is looking at investing in a new system for running parallel code, both MPI and OpenMP. They are considering either using a cloud computing platform or purchasing their own HPC equipment. You are required to write a short report analysing the different platforms and detailing your recommendations. The report will go to both the Head of IT and the Head of Finance, so it should be aimed at a non-specialist audience. Assume the company is a medium-sized consultancy with around 50 consultants, who will likely utilise an average of 800 CPU hours each per month. Your report should include:

1) A definition of cloud computing and how it can be beneficial.
2) An analysis of the advantages and disadvantages of the different commercial platforms over a traditional HPC.
3) A cost analysis; assume any on-site HPC will likely need dedicated IT support
4) Your recommendations to the company.
5) References



This report aims to help the company invest in a new system for running parallel code with OpenMP and MPI. The company is deciding between using a cloud computing platform and purchasing its own HPC equipment. The report begins with the definition of cloud computing and the ways in which it can benefit a company, and describes how high-performance computing and other commercial platforms work, comparing the two. The next section analyses the advantages and disadvantages of different commercial platforms over a traditional HPC, highlighting the points that show which platform the company should choose. A cost analysis follows, based on the assumption that any on-site HPC will need dedicated IT support. After analysing all these points, recommendations are given to help the company decide whether it should choose high-performance computing or a commercial cloud platform, giving the company an idea of how to invest in the new system for running parallel code. The report ends with a short conclusion summarising each point presented.

Definition of Cloud Computing

Cloud computing refers to the on-demand delivery of services such as data storage, computing power, and computer system resources: the delivery of services over the internet, including databases, networking, data storage, software, and servers. It is called cloud computing because the information and data it accesses are held remotely in virtual space, "in the cloud". Cloud computing removes the heavy lifting of processing and crunching data from the local computer device (Al-Hujran et al. 2018) and moves the work to huge computer clusters far away in cyberspace. More generally, cloud computing is a term for anything that involves delivering hosted services over the internet. Cloud infrastructure involves both hardware and software components, which are required to implement a proper cloud computing model. Cloud computing can also be thought of as on-demand computing or utility computing. In short, cloud computing is the delivery of information-technology resources over the internet (Arunarani et al. 2019).

Following are the points that show how cloud computing can be beneficial:

Agility- The cloud gives easy access to a broad range of technologies, so users can develop almost anything they can imagine. Resources can be quickly spun up to deliver the desired result, from infrastructure services such as storage, databases, compute, IoT, data lakes, analytics, and machine learning. These technology services can be deployed in a matter of seconds and scaled to various magnitudes. With the help of cloud computing, a company can test new ideas and experiments to differentiate customer experiences, and it can transform the business too (Kollolu 2020).

Elasticity- With cloud computing, a business system becomes capable of adapting to workload changes by provisioning and deprovisioning resources in an autonomic manner. Resources can be expanded or shrunk almost instantly to match the capacity the business needs.

Security- Data security is something almost every organization is concerned about, but with the help of cloud computing one can keep all data and information private and safe (Younge et al. 2017). A cloud host monitors security carefully, often more closely than a conventional in-house system, and a user can set different security levels according to need.

Quality control- In a cloud-based system, a user can keep all documents in a single format and in a single place. Cloud computing supports data consistency and helps avoid human error and the risk of data attacks. If data and information are recorded in the cloud-based system, there is a clear record of updates and revisions. By contrast, if data are kept in silos, documents may be saved in different versions, leading to diluted data and confusion.

Analysis of Different Platforms vs HPC

HPC, or high-performance computing, is the ability to perform complex calculations and process data at very high speed. The supercomputer is one of the best-known high-performance computing solutions: it consists of a large number of compute nodes that work together at the same time to complete one or more tasks. This simultaneous processing across multiple computers is called parallel processing. Compute, network, and storage are the three components of an HPC solution. In general terms, HPC means aggregating computing power in such a way that it delivers much higher performance than one could get out of a typical desktop. Besides its many advantages, it also has some disadvantages. The following points analyse the advantages and disadvantages of different commercial platforms over a traditional HPC.
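The parallel-processing idea described above can be sketched with Python's standard multiprocessing module, as a toy stand-in for MPI or OpenMP: the work is split into chunks, each chunk is handled by a separate worker process, and the partial results are combined at the end.

```python
from multiprocessing import Pool

def partial_sum(bounds):
    """Sum the integers in [start, stop) -- one worker's share of the job."""
    start, stop = bounds
    return sum(range(start, stop))

if __name__ == "__main__":
    n, workers = 1_000_000, 4
    step = n // workers
    # Split the full range into equal chunks, one per worker process.
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    with Pool(workers) as pool:
        total = sum(pool.map(partial_sum, chunks))
    # Same answer as the serial sum 0 + 1 + ... + (n - 1).
    assert total == n * (n - 1) // 2
```

On a real HPC cluster the same pattern is expressed with MPI ranks or OpenMP threads, but the structure (divide, compute in parallel, combine) is identical.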

The advantage of different platforms over a traditional HPC is as follows:-

- From a cost perspective, high-performance computing equipment is very expensive. The cost of using a high-performance computing cluster is not fixed; it varies with the type of cloud instances used to create the cluster. If the cluster is needed only for a very short time, on-demand instances are used to create it, and the instances are deleted afterwards (Tabrizchi and Rafsanjani 2020); this can cost almost five times more than a local cluster. Moving to other cloud computing platforms, however, reduces the cost of managing and maintaining the IT system: there is no need to buy any equipment, and the user pays only for the service provider's resources. This is one benefit of other platforms over traditional HPC.

- Different cloud computing platforms, unlike traditional HPC, allow users to be more flexible in how and where they work. For instance, a user can access data from home, during the commute to and from work, or on holiday; if off-site access is needed, the user can connect to the data easily and quickly at any time. By contrast, with traditional HPC the user has to be at the system to access the data, and it is difficult to move the data used by an HPC cluster (Varghese and Buyya 2018).

- Maintaining separate clusters in one's own data centres poses security challenges. On other cloud computing platforms there is less such risk: data can be kept and stored safely and privately, and the user need not worry about situations such as natural disasters or power failures, as the data are stored safely in the system. This is another advantage of other platforms over traditional HPC (Mamun et al. 2021).

Disadvantages of other platforms over traditional HPC are as follows:

- The cloud, like any other setup, can experience technical problems such as network outages, downtime, and reboots. These events can trouble a business by incapacitating its operations and processes, leading to damage to the business.

- On cloud computing platforms, unlike traditional HPC, the cloud service provider owns, monitors, and manages the cloud infrastructure. The customer does not get complete access and has little control over the infrastructure; there is no access to administrative tasks such as managing or updating firmware, or accessing the server shell.

- Not all features are available on every cloud computing platform (Bidgoli 2018). Cloud services differ from each other: some providers offer only limited versions, and some offer only the most popular features, so the user may not get every customisation or feature they need.

- Another disadvantage of cloud computing platforms compared with high-performance computing is that users hand over all their information and data when moving services to the cloud. Even companies with in-house IT staff are then unable to handle issues on their own (Namasudra 2021).

- Downtime is another disadvantage of cloud-based services; some experts consider it the biggest drawback of cloud computing. Since cloud computing depends on the internet, there is always a chance of a service outage for unforeseen reasons (Kumar 2018).

- All components in cloud computing remain online, which exposes potential vulnerabilities; many companies have suffered severe data attacks and security breaches.

The above sections analysed other commercial platforms against traditional high-performance computing.

Cost Analysis

It is very important to understand how cloud providers set the prices of their services. The company has engaged a cost-analytics team to calculate the total cost it would incur to set up a cloud-based platform; the team will decide which costs to include and which to exclude in the calculation. In this cost analysis, it is assumed that any on-site HPC will need dedicated IT support. Network, compute, and storage are the three cost centres of a cloud environment. The points below break down the cost of cloud services and give an idea of how cloud providers decide how much to charge the user.

Network- When setting the price of a service, the cloud provider works out the expense of maintaining the network. This includes the cost of maintaining the network infrastructure, the cost of network hardware, and labour costs. The provider sums these costs and divides the result by the rack units the business needs for its Infrastructure-as-a-Service cloud (Netto et al. 2018).

- Maintenance of network infrastructure- this includes the cost of security tools such as firewalls, patch panels, LAN switching, load balancers, uplinks, and routing: all the infrastructure that keeps the network running smoothly.

- Cost of network hardware- every service provider has to invest in some type of network hardware. Providers buy hardware and charge its depreciation over the device's lifecycle.

- Labor cost- labor cost includes the cost of maintaining, monitoring, and troubleshooting the cloud computing infrastructure (Zhou et al. 2017).

Compute- every business enterprise has its own requirements, including CPU. The service provider calculates the compute cost by determining a price per GB of RAM used by the company.

- Hardware acquisition- this is the cost of acquiring hardware for every GB of RAM the user will use, with the cost depreciated over the lifecycle of the hardware.

- Hardware operation- in the public cloud the provider takes the total cost of the RAM and divides it by the rack units of hardware. This cost includes usage-based subscription and licensing costs.

Storage- storage costs are calculated in the same way as compute costs. The service provider works out the cost of operating the storage hardware and of acquiring new hardware as the users' storage needs grow (Thoman et al. 2018).
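The three cost centres above can be sketched as simple unit-price formulas. The figures and function names below are invented for illustration; a real provider's pricing model is far more detailed.

```python
# Illustrative sketch: a provider sums a cost centre's expenses and spreads
# them over the capacity sold, as described in the points above.

def network_price_per_rack_unit(infrastructure_cost: float,
                                hardware_cost: float,
                                labor_cost: float,
                                rack_units: int) -> float:
    """Sum the network cost centres and divide by the rack units sold."""
    return (infrastructure_cost + hardware_cost + labor_cost) / rack_units

def compute_price_per_gb_ram(hardware_acquisition: float,
                             hardware_operation: float,
                             total_gb_ram: int) -> float:
    """Spread depreciated acquisition and operating costs over usable RAM."""
    return (hardware_acquisition + hardware_operation) / total_gb_ram

# Example with made-up annual figures:
network_unit = network_price_per_rack_unit(120_000, 80_000, 200_000, 400)
ram_unit = compute_price_per_gb_ram(150_000, 50_000, 100_000)
print(f"network: ${network_unit:.2f}/rack unit, compute: ${ram_unit:.2f}/GB RAM")
```

The same divide-by-capacity pattern applies to the storage centre, with hardware operating and acquisition costs spread over the gigabytes provisioned.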


As the company is looking to invest in a new system for running parallel codes, some recommendations are given in the section below. These will help the company decide which system to invest in so that it can run parallel codes smoothly. Based on the analyses in the earlier parts of this report, the company should adopt one of the platforms available in cloud computing: traditional HPC presents many barriers to users, such as high cost and the difficulty of moving and storing data. That said, cloud computing is not ready for the company to use out of the box, so the following recommendations can strengthen a cloud deployment:

Recommendations to minimize planned downtime in the cloud environment:

- The company should design the new system's services for disaster recovery and high availability. A disaster recovery plan should be defined and implemented in line with the company's objectives, specifying recovery point objectives and the lowest possible recovery time.

- The company should leverage the different availability zones provided by cloud vendors. For high fault tolerance, it should consider multi-region deployments with automated failover to ensure business continuity.

- Dedicated connectivity such as AWS Direct Connect, Partner Interconnect, or Google Cloud's Dedicated Interconnect should be implemented, as these provide a dedicated network connection between the user and the cloud service endpoint. This reduces the business's exposure to interruption risks from the public internet.

Recommendations to the company with respect to the security of data:

- The company is recommended to understand the cloud provider's shared responsibility model. Security should be built into every step of the deployment. The company should know who holds access to each resource and data set, and should limit access following the principle of least privilege.

- The company is recommended to implement multi-factor authentication for all accounts that provide access to systems and sensitive data, and to turn on every possible form of encryption. It should adopt a risk-based strategy that secures all assets hosted in the cloud and extends that security to end-user devices.

Recommendations to the company to reduce the cost of the new cloud-based system:

- The company should ensure that the service offers options to scale up and down.

- If the company's usage is low, it should take advantage of pre-pay and reserved instances. The company can also automate stopping and starting instances to save money when the system is not in use, and create alerts to track cloud spending.
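The stop/start automation suggested above can be sketched as a schedule check plus a back-of-the-envelope saving estimate. The office hours and hourly rate below are hypothetical; a real deployment would wire this logic to the provider's scheduler or instance API.

```python
from datetime import time

# Hypothetical working hours during which the instances should be running.
RUN_START, RUN_END = time(8, 0), time(19, 0)

def should_run(now: time) -> bool:
    """True when the current time falls inside paid working hours."""
    return RUN_START <= now <= RUN_END

def monthly_saving(hourly_rate: float, idle_hours_per_day: float,
                   days: int = 30) -> float:
    """Rough saving from stopping an instance during its daily idle hours."""
    return hourly_rate * idle_hours_per_day * days
```

For example, stopping a $0.50/hour instance for 13 idle hours a day saves roughly $195 a month, which is the kind of figure a spending alert would help track.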
Recommendations to the company to maintain flexibility in the system:

- The company should consider a cloud partner to help implement, support, and run the cloud services. It is also necessary to understand the vendor's responsibilities in the shared responsibility model in order to reduce the chance of errors and omissions.

- The company must understand the SLA as it applies to the services and infrastructure it is going to use, and before developing the new system it should understand the impact the change will have on existing customers.

By following all the above recommendations, the company can decide how and where to invest in the development of the new system.


The report was prepared to help the company decide which system to invest in. The company has two options: use a cloud computing platform or purchase all the HPC equipment itself. The report explains what cloud computing means and the benefits it offers a business organisation; in simple terms, cloud computing is the delivery of a product or service over the internet. Both systems are analysed, and the advantages and disadvantages of cloud platforms over high-performance computing are compared on several bases: system cost, data security, access control, and more. A cost-analysis structure is also presented so that the company can estimate how much it would need to invest in the new system, and some recommendations are given. The company is recommended to choose the cloud computing platform, as it is secure and its setup cost is lower than the alternatives.





Task Summary

Reflecting on your initial report (A2), the organisation has decided to continue to employ you for the next phase: risk analysis and development of the mitigation plan.

The organisation has become aware that the Australian Government (AG) has developed strict privacy requirements for business. The company wishes you to produce a brief summary of these based on real-world Australian government requirements (similar to how you used real-world information in A2 for the real-world attack).

These include the Australian Privacy Principles (APPs), especially the requirements on notifiable data breaches. The company wants you to examine these requirements and advise them on their legal obligations. Also ensure that your threat list includes attacks leading to customer data breaches. The company wishes to know if the GDPR applies to them. The word count for this assessment is 2,500 words (±10%), not counting tables or figures. Tables and figures must be captioned (labelled) and referred to by caption. Caution: items without a caption may be treated as if they are not in the report. Be careful not to use up word count discussing cybersecurity basics. This is not an exercise in summarizing your class notes, and such material will not count towards marks. You may, however, cover theory beyond what was taught in classes.


Assessment 3 (A3) is in many ways a continuation of A2. You will start with the threat list from A2, although feel free to make changes to the threat list if it is not suitable for A3. You may need to include threats related to privacy concerns. Beginning with the threat list:

- You need to align threats/vulnerabilities, as much as possible, with controls.
- Perform a risk analysis and determine controls to be employed.
- Combine the controls into a project of mitigation.
- Give advice on the need for ongoing cybersecurity, after your main mitigation steps.


- You must use the risk matrix approach covered in classes. Remember risk = likelihood x consequence.

- You should show evidence of gathering data on likelihood, and consequence, for each threat identified. You should briefly explain how this was done.

- At least one of the risks must be so trivial and/or so expensive to control that you decide not to control it (in other words, in this case, accept the risk). At least one of the risks, but obviously not all.

- Provide cost estimates for the controls, including policy or training controls. You can make up these values but try to justify at least one of the costs (if possible, use links to justify costs).



A mitigation plan is a method of factoring risk that helps to progress actions and options; it creates opportunities and decreases the threats to project objectives. In this section, the researcher discusses threat analysis using matrix methods, threats and controls, and mitigation schemes. A threat model is a structured representation of the data collected about an application's security: in essence, a view of the applications and their environment through a security lens. Put another way, threat modelling is a structured process focused on identifying potential security threats and vulnerabilities. The threat model also rates the seriousness of each threat identified for the industry, and it identifies the particular techniques that can be used to mitigate those issues. Threat modelling has several significant steps that must be followed to mitigate cybercrime threats.

Body of the Report

Threat Analysis

Threat analysis is a process generally used for determining which components of a system most need protection and which security threats they face. It aims to identify the information and physical assets of an organisation. The organisation should understand the threats to its assets, as this strengthens the mitigation plan in the threat report (Dias et al. 2019).

Organisations determine the effect of economic losses using qualitative and quantitative threat analysis. Threat analysis ensures readiness for the crucial risk factors of any project. It involves several important steps: recognising the sources of risk or threats; categorising the threats and building a community-based profile; determining the weaknesses; building scenarios and applying them; and, finally, making a plan for emergencies.

Threat analysis here follows the risk matrix approach to carry the mitigation plan forward. There are four types of mitigation strategy: acceptance, transference, limitation, and avoidance (Allodi & Massacci, 2017).


Table 1: Risk matrix methods
(Source: Self-created)
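The risk = likelihood x consequence rule from the brief can be sketched as follows. The 1-5 scales, band cut-offs, and threat ratings below are illustrative assumptions, not values taken from the report.

```python
def risk_score(likelihood: int, consequence: int) -> int:
    """risk = likelihood x consequence, each rated on a 1-5 scale."""
    assert 1 <= likelihood <= 5 and 1 <= consequence <= 5
    return likelihood * consequence

def risk_band(score: int) -> str:
    """Illustrative bands; a real matrix defines its own cut-offs."""
    if score >= 15:
        return "extreme"
    if score >= 8:
        return "high"
    if score >= 4:
        return "medium"
    return "low"

# Hypothetical ratings for the threats discussed below, highest risk first:
threats = {"cyber hacking": (4, 5), "data leakage": (3, 4),
           "insider threat": (3, 5), "phishing": (4, 3)}
for name, (l, c) in sorted(threats.items(), key=lambda t: -risk_score(*t[1])):
    print(f"{name}: {risk_score(l, c)} ({risk_band(risk_score(l, c))})")
```

Ranking threats by score like this is what drives the later decision to treat some risks and accept others.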

Cyber Hacking

Hackers attacked the data of the food company JBS, one of the largest meat-processing organisations in Australia. This made it a crucial issue in Australia, and the company's management is worried about cyber hacking, which it regards as criminal behaviour. It took almost four months to mitigate the incident, and it remains a major threat to JBS Foods.

Data Leakage

Data leakage is a basic but serious challenge for the food company, as it degrades the company's services. Internal employees are often involved in this type of activity, and the company cannot keep faith in employees who engage in it. This is a crucial threat that needs to be fixed quickly so that the company can survive it (Scully, 2011). It also creates mistrust between management and employees. It took 25 days to fix all the issues and mitigate the company's situation.

Insider Threat

There is a very high chance of data being leaked by employees of the food company JBS. This insider threat recurs more or less continually. Insider threats can damage the internal culture of the company, with both employees and management suffering from the data-leaking process. Sometimes it reflects a company failure, where management cannot maintain the cohesion of the organisation. It took almost two months to mitigate the condition, and at times it could not be controlled by the authorities at all.


Phishing

Phishing tricks employees into revealing secret codes or sensitive information that should remain hidden from outsiders; attackers pose as trustworthy contacts to extract information about the largest food company in Australia, JBS. The risk to its systems is high, and it took 65 days to mitigate the company's situation.

Threats and Controls

“Recent research on the usability of security technologies — often termed HCISEC (joining human-computer interaction with security) — tries to put humans in the loop and views usability as a key component for both accepting security technologies and using them correctly” (Wagner et al., 2019). There are major threats in the mitigation plan that need to be controlled to stabilise the internal condition of the Australian food company JBS. Keeping data and information secure through cyber security is the company's main motive. Data tampering, information disclosure, and repudiation threats are major parts of cyber security. Data tampering exposes the data or information of the company; it is a risk factor that can delete files containing important documents, and it is one of the major cyber threats that can leak private and sensitive information to third parties.

Data tampering is an unauthorized and intentional act that needs to be eradicated by data scientists as soon as possible. It can change the entire pattern of a dataset, delete important files, and introduce anomalies into important datasets. Hackers can also eavesdrop on important conversations using this method. It has caused major problems in large-scale business organizations. The major risk of data tampering is that an important message can be altered in transit and the useful information it carries deleted by third parties (Ivanov & Dolgui, 2020).

Information disclosure, also known as information leakage, is one of the major issues that can enable cyber attacks (Oosthoek & Doerr, 2021). It can reveal sensitive information about the users of a social media platform and hamper a person's privacy. It can leak information to hackers, causing major trouble for an organization or an individual, and disclosing financial information to potential attackers is an especially severe issue. Everyone therefore needs to be careful about a website before putting any kind of information into it. A repudiation threat may occur when the system does not properly control and log users' log-in and log-off actions. It can enable data manipulation, which can cause severe problems for a person or an organization, and it can let users forge log-based activity by denying actions they actually took. For example, if a user performs illegal actions and then denies them to evade the system's accountability checks, that can be counted as a cyber attack.

Business impact analysis is a crucial part of controlling the company's risk factors and challenges. It benefits the food company JBS by framing these issues within the threat mitigation plan. The company also needs to maintain strategies so that management can recover from the various challenges posed by the risks in the mitigation plan. A recovery plan works as a backup plan for fixing the challenges and controlling the various issues in risk management, and recovery exercises play a great role in rehearsing such situations. Third-party suppliers can sometimes help to control these types of issues. Although the company needs time to bring the situation under control, management can then handle the various challenges that arise for different reasons. The food company should use advanced technologies and appropriate policies to control all the threats in its mitigation plan (Gius et al. 2018).

Mitigation Scheme


Malware

Malware is considered the most important threat, as it mainly attacks the network system and leads to information disclosure. Simply put, malware is intrusive software specially designed to damage or destroy a computer system, and the outcome is the loss of important data. To mitigate this threat, the computer system should be kept updated, and suspicious links or documents should not be downloaded (Aslan & Samet 2020). The system should also have a good backup to recover from an infection. Besides this, a scanner must be used to identify infections, and monitoring should be in place to resist attacks. Above all, users must be aware of this threat and understand how it operates.


Figure 1: Mitigation techniques of Malware threat
(Source: Self-created)


Phishing

This threat is very harmful, as it mainly arrives by email and is most often found in large business organizations. To mitigate it, users should be aware of the threat and know the mitigation techniques. To detect it, users must understand URL classification schemes, loss estimation, and strategies for mitigating this risk factor (El Aassal et al. 2020). In the URL classification scheme, the user should know the JavaScript and HTML features involved.


Table 1: Mitigation of Phishing threat
(Source: Self-Created)
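The URL classification scheme mentioned above can be illustrated with a toy heuristic. The features and threshold below are assumptions for demonstration only; production phishing detection uses much richer feature sets and trained models.

```python
import re
from urllib.parse import urlparse

def suspicious_score(url: str) -> int:
    """Count simple lexical red flags in a URL (illustrative features)."""
    parsed = urlparse(url)
    host = parsed.hostname or ""
    score = 0
    score += host.count(".") > 3                   # many nested subdomains
    score += bool(re.fullmatch(r"[\d.]+", host))   # raw IP instead of a name
    score += "@" in url                            # userinfo trick hides host
    score += len(url) > 75                         # unusually long URL
    score += parsed.scheme != "https"              # no TLS
    return score

def looks_phishy(url: str, threshold: int = 2) -> bool:
    return suspicious_score(url) >= threshold
```

For example, `http://192.168.0.1/login` trips both the raw-IP and no-TLS features and is flagged, while `https://example.com` scores zero.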

MitM Attacks

The man-in-the-middle attack mainly targets the network side of the computer system and is a main cause of information disclosure and compromised security. This threat is mainly found in e-commerce and financial businesses; it inserts itself between the user and the server (Lahmadi et al. 2020). The attack can be mitigated by using a VPN, which is very helpful for encrypting web traffic, and by connecting only to secured Wi-Fi routers.


Table 2: Mitigation of MitM Attacks
(Source: Self-Created)

DOS Attack

A DoS attack is one of the most significant threats to a computer system, and it is a growing concern in network security. It is mainly aimed at high-profile business organizations, attacking the network and stopping all of its services. This threat can be mitigated by monitoring network traffic and analyzing it properly (Dwivedi, Vardhan, & Tripathi 2020). The basic detection policy is to examine all packets and detect anomalous network flows. Apart from that, a CPRS-based approach is considered an important mitigation policy for this threat. Prevention systems such as VPNs and content filtering must also be included, and combining firewalls with anti-spam measures is another important way of detecting this threat.


Table 3: Mitigation of DOS Attack
(Source: Self-Created)
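The traffic-monitoring idea above can be sketched as a per-source packet counter with a threshold. The threshold value is an assumption; real DoS detection also involves flow analysis, rate limiting, and upstream scrubbing.

```python
from collections import Counter

def flag_flooders(packet_sources: list[str], threshold: int = 100) -> set[str]:
    """Return sources that sent more packets than the threshold in a window."""
    counts = Counter(packet_sources)
    return {src for src, n in counts.items() if n > threshold}
```

A monitor would feed this one time-window of observed source addresses at a time and alert on (or block) any address it returns.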

SQL Injection

This threat is considered one of the most significant threats to networked systems, as it mainly tampers with a system's important data. It can be found in any network- or technology-based business organization. It basically attacks the server and hampers the working of the system, and it occurs when a hacker injects malicious code into the system's database server (Latchoumi, Reddy & Balamurugan 2020). To mitigate this threat, one should validate all input and parameterize all queries using prepared statements; user-supplied input should never be concatenated directly into application code. Stored procedures also help mitigate this threat, and, most importantly, all user-supplied inputs should be escaped.


Table 4: Mitigation of SQL Injection
(Source: Self-Created)
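The parameterized-query advice above can be demonstrated with Python's built-in sqlite3 module, whose `?` placeholders bind input as data rather than as SQL:

```python
import sqlite3

# In-memory demo database with a couple of rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'staff')")

def find_user(name: str):
    # Parameterized query: the driver binds `name` as data, so input like
    # "' OR '1'='1" cannot alter the statement's structure.
    cur = conn.execute("SELECT name, role FROM users WHERE name = ?", (name,))
    return cur.fetchall()

print(find_user("alice"))          # [('alice', 'admin')]
print(find_user("' OR '1'='1"))    # [] -- the injection attempt matches nothing
```

Had `name` been concatenated into the SQL string instead, the second call would have returned every row in the table.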

Zero-day Exploit

This threat refers to the exploitation of a vulnerability before a fix is available, and it can be found in any organization (Blaise et al. 2020). The mitigation policy for this threat is to find out the time of the attack as well as the time the patch is dispatched.


Table 5: Mitigation of Zero-day Exploit
(Source: Self-Created)

Password Attack

A password attack is one of the most significant threats to a technology-based organization, and it is mostly found on the computer devices of IT businesses. Such attacks often begin with phishing or credential attacks against the network, so those stages must be addressed first. Key loggers, MitM, and dictionary attacks should also be reduced to mitigate the emergence of this threat.


Table 6: Mitigation of Password Attack
(Source: Self-Created)

Cross-site Scripting

This threat mainly harms the websites of e-commerce business organizations, as well as those of other companies.


Table 7: Mitigation of Cross-site Scripting
(Source: Self-Created)


Rootkits

This threat is mostly found in technological systems and causes data disclosure.

Table 8: Mitigation of Rootkits
(Source: Self-Created)

IoT Attacks

This threat is mainly found in IT organizations and is very harmful through the elevation of privileges.


Table 9: Mitigation of IoT Attacks
(Source: Self-Created)


Considering the above, it can be concluded that there are several kinds of cyber threats that can be very harmful to networks and computer systems. Defining all the security management requirements is the first step of the threat model, after which an application model is created. Finding the potential threats is also very important, and those threats should then be mitigated to close the security gaps. Threat modeling is considered a proactive strategy for evaluating potential risk factors: it includes identifying the threats and improving the tests and processes for detecting them. The approach should also work out the impact of the threats and classify them, and applying the proper countermeasures is part of the threat model approach as well.



MIS604 Microservices Architecture Report Sample

Assessment Task

This research paper should be approximately 2500 words (+/- 10%) excluding cover page, references and appendix. In this assessment, you need to present the different issues that have been previously documented on the topic using a variety of research articles and industry examples. Please make sure your discussion matches the stated purpose of the report and include the cases study throughout.

Discuss and support any conclusions that can be reached using the evidence provided by the research articles you have found. Details about the different industry cases studies should NOT be a standalone section of the paper.


Microservices architecture (MSA) is one of the most rapidly expanding architectural paradigms in commercial computing today. It delivers the fundamental benefits of integrating processes and optimisation, delivering efficiency across many areas. These are the core benefits expected in any implementation, and the MSA is primarily configured to provide for functional business needs.

On the one hand, MSA can be leveraged to provide further benefits for a business by facilitating:

- Innovation — reflecting the creation of novel or different services or business processes, or even disruptive business models.

- Augmented Reality — reflecting the situation where superimposing images and data on real objects allowing people to be better informed.

- Supply chain— reflecting how the MSA enables closer communication, engagement and interactivity amongst important external or internal entities.

On the other hand, culture is the totality of socially transmitted behaviour patterns, attitudes, values and beliefs, and it is these predominating values and behaviours that characterise the functioning of an individual, group or organisation. Organisational culture is what makes employees feel they belong and what encourages them to work collectively to achieve organisational goals. Extant IS implementation studies have adopted culture theory to explain how organisations respond to implementing an MSA system in their workplace, and how these responses lead to successful or failed implementations.

As a professional, your role will require that you understand the benefits of MSA, especially in these three areas, which are increasingly becoming the preferred strategy for achieving competitive advantage in many organisations. The purpose of this report is to engage you in building knowledge about how these benefits are achieved in an organisational environment, with a specific focus on how and why organisational culture can influence the successful implementation of an MSA within an organisation.



Microservice Architecture (MSA) evolved from Service-Oriented Architecture (SOA). For the most part, microservices are smaller and more focused than the big "services" of the 2000s. These apps expose a well-made interface and are hosted and made available over the network. Other programmes can access this interface using a so-called RPC (Remote Procedure Call) (Fernández-García et al., 2017). Around 2,200 key microservices, dubbed "Domain-Oriented Microservice Architecture" (DOMA), have been added to Uber's infrastructure. This paper presents views on how Uber utilised microservices to bring performance, agility, and scalability to its organisation, focusing on three key tenets: Supply Chain, Augmented Reality, and Innovation. Furthermore, the importance of culture and how culture affects MSA adoption is also discussed in the paper.
Microservices for Uber


Today's customers are extremely empowered, driven, and self-determined. They are quick to choose the most sophisticated and/or the cheapest option, since they have all the information and computational power they need at their disposal; as a result, they should be regarded as "internal" customers too. Consumers are no longer satisfied with an IT department's clunky and restrictive software. In the same respect, a customer will not find it pleasing to use an application that lets him book a cab but, rather than getting it done quickly, ends up taking longer than making a phone call. As a result of their success, high-performing enterprises were three times more inclined to pursue a first-mover edge (Fernández-García et al., 2017); for example, starting a news website is far easier than starting a newspaper. Failing to acknowledge the value of speed, flexibility, and agility would have a significant negative influence on a company's capacity to thrive (Ghirotti et al., 2018).

Uber, on the other hand, would be constrained by its monolithic architecture when making significant modifications to its system in response to client demand, because such a design is:

- Expensive and time-consuming

- Too inflexible to change and as a result, too sluggish to take advantage of the situation

- Such that, at times, no one individual can fully comprehend the structure, even though this is virtually a necessity.

- Because they aren't built to open standards, the skill pools available to companies are rather limited.

- So difficult to manage that users are compelled to find alternative means of getting business done outside the system (frequently sticking with more laborious, manual, error-prone methods, such as, in Uber's case, booking a cab by phone call or opting for a traditional taxi).

Apart from the above, a traditional monolithic architecture would limit Uber because it would be hard to customise, and any change brought into the system would carry a high failure rate, as many elements would need to be untangled.

The previous system at Uber was large and homogeneous: a new release of any one small feature required a release of the entire system, presenting a risk to the system as a whole. The proliferation of mobile devices only exacerbated this dilemma, with multiple device types, models, and operating systems to manage, as an Uber passenger could be holding any of the thousands of mobile device models in use today. Similarly, Amazon was unable to implement new features quickly because of the large number of developers distributed around the company; customers could not use important code updates for weeks because they were blocked in the deployment process. Amazon's pipeline was simplified and shortened thanks to the introduction of microservices. A service-oriented design enables developers to identify bottlenecks, characterise these slowdowns, and rebuild them, with a small team devoted to each service, resulting overall in innovation.

Figure 1 - Uber Microservices (Gluck, 2020)

APIs, which serve as the "contract" linking microservices, are a critical mechanism for breaking out of monoliths. Uber's trade-balance and exchange-info microservice, for instance, illustrates this point: Uber would not be able to serve riders in over 60 currencies across the world if the application were "cobbled" together as a monolith, which would hinder true innovation and limit actual revenue potential (He & Yang, 2017).
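The contract idea can be illustrated with a toy, in-process stand-in for such a microservice: the client knows only a JSON request/response contract, never the service internals. The rates and field names below are invented for illustration; Uber's actual service interfaces are not public.

```python
import json

# Hypothetical exchange rates held privately by the "currency" service.
RATES = {"USD": 1.0, "EUR": 0.92, "AUD": 1.52}

def currency_service(request_json: str) -> str:
    """Convert an amount between currencies behind a JSON contract."""
    req = json.loads(request_json)
    amount = req["amount"] * RATES[req["to"]] / RATES[req["from"]]
    return json.dumps({"amount": round(amount, 2), "currency": req["to"]})

# A client only ever sees the contract, so the service's internals (rates,
# storage, language) can change freely without breaking callers:
reply = currency_service(json.dumps({"amount": 10, "from": "USD", "to": "EUR"}))
print(reply)
```

In a real deployment the same request/response shape would travel over the network via an RPC framework or HTTP rather than a local function call.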

Augmented Reality

The branch of computer science known as "augmented reality" (AR) deals with the integration of real-world data with data created by computers. Augmented reality technology may be used on mobile devices, such as smartphones, as well as on personal computers. An Uber driver may use the app to help customers locate their cars more quickly, or the other way around. When it comes to picking up passengers, the Uber app uses an integrated Augmented Reality Control Module (ARCM) to help passengers meet available drivers. The user sends trip request data to Uber, including pick-up position, drop-off destination, and even departure time if it is a planned ride. Based on the trip request, Uber matches the passenger with various local drivers and provides the pick-up information to the first driver who agrees. Uber tracks the driver's progress as he or she approaches the pick-up spot. Once the driver arrives within a predetermined distance of the pickup point, Uber sends a notification to the passenger's phone instructing it to broadcast a live stream from the device's camera. Uber then uses image recognition to detect whether the driver is present in the live video stream based on the driver's information, such as the vehicle's make, type, colour, and registration. By computing a vehicle value, which also depends on driver characteristics, a trained model predicts whether an oncoming or halted car is one's cab. On top of that, Uber applies AR features to the live broadcast to identify an incoming car as the taxi.

Figure 2 - Uber AR Patent (Patent Yogi, 2019)

The aforementioned architecture can be implemented using a 4-tier architecture comprising a Designer, Supplier, Intelligence and Customer tier.

Customer Tier

The customer tier contains a management component that governs events such as examining virtual items, speaking with designers, or placing orders. Under the controller are a number of subcomponents that allow available cabs to be explored and displayed. There are several different types of markers, all of which are printed on the same sheet of paper. A smart device's camera can capture and recognise a marker's form and location, and the visualisation component uses the marker to display the cab. The communication component allows customers and designers to connect with one another orally, via video images, or via a live videoconference. This component serves several purposes, including communication as well as capturing markers.

Designer Service Tier

Service containers bundle the services that the system delivers to designers, such as rendering, viewing, and web services, in a single location. Data from the system, the designer service tier, and the customer tier is exchanged between the information processor and the data processor's information tiers.

Supplier Service Tier

The supplier service tier, like the designer service tier, is made up of a controller, a communication component, a service container, and an information processor. These elements operate much like their counterparts on the designer service tier, although they may serve other purposes. For example, the service container could comprise services such as scheduling and delivering rides to customers, or offering current transportation alternatives to designers.

Intelligent Service Tier

In this tier's computational model, reflex agents have been replaced by motivated agents and learning agents. To activate learning, planning, or other agent functions, the motivation process uses information from the observed environment as well as its own memory. It sets goals and drives the agent to work towards them. To comprehend and replicate interesting activities that happen in their environment, agents form objectives. With a 3D map, the tier can locate nearby cabs and overlay relevant information, such as the number of steps it will take to reach the vehicle. A database stores the information gleaned from the learning process.

Supply Chain

The entire supply chain of Uber is based on the aggregator model: Uber plays a mediator role in connecting service requesters to service providers. A large-scale fulfilment procedure sits at the heart of the process, so the entire system depends on demand and supply scattered across a large geographic space. One can therefore naturally expect a plethora of problems when trying to get these dissimilar systems and components to function as a single logical entity (Ke et al., 2019).

To put this in context, imagine an Uber car that serves one passenger, then another, and so on over a vast geographic area and period. Consequently, it is not just services being exchanged; payments are also being handled by numerous financial institutions along the supply chain's cycle. In addition, current supply chains lack one critical component: visibility. This further complicates the whole process. Any supply chain solution or product must solve the challenges listed above in order to be successful. Products and solutions that effectively solve these issues, without jeopardising the integrity of the data or transactions they are built upon, will be more successful than those that do not. The route to success in a distributed world is an efficient design that works and scales. SaaS systems can be complicated and large-scale, and there is no single architecture or technique for building them (Krzos et al., 2020). Similarly, Etsy was plagued by performance issues for a number of years prior to its adoption of microservices. Its technical team needed to reduce server processing time while coping with simultaneous API calls, which were not easily supported in their PHP framework. Etsy optimised its supply chain with a transition to MSA.

Microservices and process choreography capabilities are two examples of such an architecture (Valderas et al., 2020). Uber's supply chain architecture would include the following elements:

- Service encapsulation: Encapsulating services is a well-known technique in Service-Oriented Architecture (SOA). The complexity of isolated apps can be hidden behind API contracts and message canonicals. Distributed architectures are known for their loose coupling and fluid service interactions, as well as the ability to orchestrate business processes that span various organisations and applications. The platform is designed to support these capabilities.

- Event-Driven Architecture: Supply chain products and solutions, in contrast to typical monolithic systems, should be event-driven and responsive enough to adapt to the dynamism of the ecosystem. Each service in the environment acts as both a sender and a receiver of multiple business events. Under this architecture, a microservice (or agent) publishes an event whenever a business state change takes place, for example, "Ride has been booked" or "Ride has been finished". Other microservices and agents subscribe to these events. As soon as agents are notified of a noteworthy occurrence, they can update their own business state and publish further related events. If the ride status is changed to "Cancelled" by the customer, this can trigger "Cancellation charges", which in turn notifies the various stakeholders (Krzos et al., 2020).
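A minimal in-process sketch of this publish/subscribe pattern follows. The event names ("ride.cancelled", "cancellation.charged") and the fee are invented for illustration; Uber's actual topics and message broker are not public.

```python
from collections import defaultdict

# Tiny in-process event bus: services subscribe to topics and react to
# published events, chaining further events (choreography, not orchestration).

class EventBus:
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subs[topic].append(handler)

    def publish(self, topic, payload):
        for handler in self._subs[topic]:
            handler(payload)

bus = EventBus()
log = []

# The billing service reacts to a cancellation by publishing a charge event,
# which downstream services (notifications, analytics) consume in turn.
bus.subscribe("ride.cancelled",
              lambda e: bus.publish("cancellation.charged",
                                    {"ride": e["ride"], "fee": 5.0}))
bus.subscribe("cancellation.charged", log.append)

bus.publish("ride.cancelled", {"ride": "r-42"})
print(log)  # [{'ride': 'r-42', 'fee': 5.0}]
```

Note that no central coordinator decides the sequence: each service only knows which events it emits and consumes, which is exactly the loose coupling described above.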

- Process choreography: Each of the numerous apps that make up a distributed application architecture must communicate with the others in order to reach a common goal. Choreography distributes business logic in an event-driven system, in which a service (or agent) is initiated by an event to fulfil its obligation; for instance, a proof-of-delivery event produced by a vehicle tracking system triggers the accounting system to begin the payment process. The system is comprised of several services of this type. Process choreography goes beyond orchestration and is more closely matched to real-world settings. This method makes it simple to implement process changes in a matter of hours rather than weeks or months (Lian & Gao, 2020).

- Unified data: The harmonisation of master data is another critical component of this architecture, and is required for the effectiveness of any supply chain product or service. All consumers in the supply chain network should have access to this data, which is scattered across silos (groups, domains, and apps), if they are to make effective choices in real time. Due to the complexity of connecting to various data sources, creating high-quality master data and a primary source of truth in any dispersed system is difficult. In addition, retrieving, transforming, and cleaning up master data in real time is a difficult task.

- End-to-end visibility: Event-driven architecture makes digitalisation and the unification of data from many components into a single perspective possible, allowing supply chain activities to be executed and monitored without a hiccup. There are numerous advantages to this approach, including the identification of processes that are in compliance as well as those in violation, along with opportunities for process optimisation, allowing greater flexibility and adaptability to the ever-changing requirements of the business.

- Collaboration tools: All supply chain systems, especially those used by firms like Uber, rely on tools and technology that make it possible for users from across domains and worldwide networks to connect, collaborate on projects, and make appropriate real-time decisions.

Organisational culture can influence the successful implementation of an MSA

The following cultural foundations are essential for the implementation of microservices:

Diverse talents

Because microservices are always changing and evolving, the personnel who manage the architecture must have a strong desire to learn. It is therefore not enough simply to employ a diverse team of experts for its own sake; the best possible team of engineers must be assembled. The various difficulties that microservices present are far easier to overcome with a well-rounded, experienced team on side (Lian & Gao, 2020).

Freedom and control

A company's culture plays a major role in the effectiveness of microservices architecture management. Companies cannot migrate to microservices if they keep traditional procedures and methods in place, which severely limits their capacity to reap the benefits of the change. A distributed-monolith culture means that a company's microservices adventure will not succeed if it has requirements such as permissions for each new modification or commit, or even for undoing changes.

Support systems for developers

First, one has to recognise that a great deal of extra time will be invested in establishing support systems for the engineers and DevOps groups so that they can be productive during this shift. Allowing engineers the freedom to make their own decisions is essential to a loosely coupled design, and it requires a great deal of faith in their judgement. Netflix built the correct checks and balances into its system ahead of time to guarantee that it could not be exploited, and these evolved with the company as it grew while this essential aspect of its culture was maintained.

Optimized communication flow

The acceptance of microservices is strongly linked to the organisational structure and culture of a business. As a result, the information flow within a company strongly conditions the success of microservices. When teams are able to make their own judgements and execute well-informed improvements, feedback loops shorten and agility is heightened (Zheng & Wei, 2018).


The current software development process benefits from the use of a microservices architecture. It reduces development and maintenance expenses while minimising risk. Concurrent development, debugging, installation, and scaling are all made feasible. This design enables programmers to take advantage of both small- and large-scale programming. Due to the reduced complexity and organisational knowledge required to be productive, it allows a wider range of applicants to be considered. Rapid, agile changes occur on a regular basis, and clients' requirements can be met swiftly with unparalleled responsiveness and assistance. The case of Uber above is a brief snapshot of how Uber's transition to MSA, paving the way for innovation, supply chain optimisation and augmented reality, can help the company build the future of the urban transport system.


Read More

MIS500 Assignment Research Report


The assessment suite in this subject builds on the skills you develop in the learning activities. Although the learning activities carry no assessable weight by themselves, completing them and seeking input from your peers and the learning facilitator is essential for you to achieve a positive result in the subject. While researching and writing this assessment, be sure to work through the content and learning activities in modules one and two.


Sneakers and streetwear have revolutionised fashion, lifestyle and the environment.

The Global Footwear Market Report 2021 projects that the sneaker market will reach USD 403.2 billion by 2025, driven by new design trends and rising discretionary spending among an expanding middle-class population. From Nike, Toms, Puma, Adidas and Converse to Veja, Yeezy, Gucci, Louis Vuitton and Chanel, everyone is wearing sneakers. Kanye West, Mark Zuckerberg, Taylor Swift, Virat Kohli, Beyoncé and people from all walks of life, both young and old, are wearing sneakers. The sneaker industry, like all industries, has had to pivot towards environmentally friendly and sustainable sneakers. Spain, Italy and many countries in South America are leading the way in producing sneakers made of recyclable materials, including food waste.

In this assignment you will audit, analyse and evaluate the social and digital media of an environmentally friendly and sustainable sneaker brand.


Describe the history of an environmentally friendly and sustainable sneaker brand (Nike, ID.Eight, Toms, Allbirds, Veja, Flamingos Life, Cariuma, Native, Nisolo, Sole Rebels, Oliver Cabell, Thousand Fell or Adidas). You can use Australian or international sneaker brands.


Discuss (in the third person) why this environmentally sustainable brand was chosen to be audited, analysed and evaluated.

Audit and Analysis:

Visit the brand's website and audit its social media platforms. You should investigate both traditional website advertising and the social media platforms (Facebook, WeChat, Instagram, Pinterest, Snapchat, QQ, Tumblr, Qzone, LinkedIn, YouTube, TikTok, Twitter, etc.).

As a future IS professional, audit, analyse and evaluate the brand's website and its current use of social media.

Based upon research, are the website and social media platforms engaging? Evaluate, discuss and provide evidence.

Discuss how your chosen brand engages its audience in its marketing of Corporate Social Responsibility (CSR) sneakers. Your discussion should centre on the production of eco-friendly and sustainable products. Does the company or retailer actively engage with its customers? Using research that you have conducted on the social media landscape, discuss whether the website and social media platforms are engaging. Justify using evidence.

Recommendations using Porter’s Value Chain model

Use Porter's Value Chain model to identify and explain the business's primary activities (marketing and selling), using the company website and the social media channels to obtain this information.

Make three recommendations to the sneaker company or sneaker retailer on how Porter's model can enhance or maximise marketing (exposure and impact) and selling (increased sales traffic).


Discuss the actions that the sneaker company or retailer should take to increase sales and engage more actively with its customer base in the purchase of eco-friendly and sustainable products. What other types of social media (based upon research) should the company or retailer introduce?


Make three recommendations to the sneaker brand on how it can enhance or maximise the value offered to customers by producing more ethical sneakers and delivering a strong message via social media and its official website.



Nike Inc is an American sneaker company engaged in the design, development, manufacturing, and marketing of world-class sneakers and shoes. It has the largest supply of athletic shoes and is the leading manufacturer of sports equipment (Boyle et al., 2019). In this report, an evaluation of the social and digital media of an environmentally sustainable sneaker brand, Nike, is carried out.


Nike has been chosen for this audit report because it manufactures its sneakers in an environmentally sustainable, cruelty-free way. The sneakers are made of fruit waste and recycled materials. They are made in Italy from innovative and environmentally sustainable waste for unisex wear and distributed all over the world. Nike also uses sustainable packaging that allows people to help the environment by disposing of the boxes responsibly; the boxes are mostly made of biodegradable materials. Nike focuses its raw-material efforts on reducing its water footprint. The company is working hard to reduce the use of freshwater in manufacturing high-quality shoes, particularly in dyeing and finishing textiles, and promotes wastewater recycling to make environmentally sustainable sneakers (Boyle et al., 2019).

Audit and Analysis

Nike still believes in traditional marketing on its website and social media platforms. In today's world, traditional product marketing is very much alive and used by large companies. Nike promotes its products through "emotional branding", using the tag "Just Do It" on its website and social media platforms. Nike now focuses on strong branding through social media hashtags and posts with emotional captions that lift people's spirits. Nike has developed traditional branding tools that increase the brand's appeal among local people as well as celebrities all around the world (Center for Business Communication et al., 2021).

By choosing the right people for advertisement and branding, Nike gains the trust of the public. Nike's digital marketing channel is large enough to distribute knowledge about its products effectively. Nike also holds subsidiaries such as Converse, Jordan, and Hurley that help it grow. Nike has also used a Facebook Messenger bot to promote a special product, Nike Air. To create this campaign, Nike teamed up with an AI platform named Snaps. This established a conversational channel between the company and its customers, through which news about the products is sent to customers on a weekly basis. The news is divided into three categories: Air Jordan, Shop, and Watch (Thompson, 2016). The Facebook Messenger bot enables a two-way conversation between people and the company that provides a unique opportunity to connect directly with Nike's Air Jordan. The bot is effective at making conversation, with an open rate of 87% (Henry, 2012). Customers can set a notification time and collect useful website links for buying the products. So, it can be said that Nike has a strong digital advertising system in which social media is quite engaging. In 2020, Nike spent $81.9 million on community development globally. Since 2001, Nike has focused on its public commitments and aligned its operations with business priorities. Nike's corporate governance shows that the company has long-standing commitments to monitoring the effectiveness of its policies and decisions. It approaches governance so as to increase its long-term shareholder value in the global market. It also enhances CSR, human rights and the sustainability of its products. Based on Nike's global report, it spent $417 million on communities and $130 million on promoting equality and improving environmental sustainability (Thompson, 2016).



Recommendation with Porter’s Value Chain model

Porter's Value Chain model is a strategic management tool that gives the company the power to increase the value it creates along its chain.

Inbound logistics: Nike focuses on product quality and sustainability as the main reasons for its success. The company focuses on inbound logistics to support quality checks and sustainability. It operates through independent contractors and manufacturers, with more than 500 factories globally (Henry, 2012).

Operations: this includes manufacturing, assembling, packing and testing sneakers in the warehouse before distribution. Nike carries out operational analysis on a regular basis to improve productivity and efficiency and increase value.

Outbound logistics: this includes activities to store and distribute products all over the globe through retail marketing (Henry, 2012).

Marketing and sales: The primary activities the company undertakes comprise inbound logistics, operations, outbound logistics, marketing and sales, and service (Karvonen et al., 2012). The goal of these five sets of activities is to create business value and generate higher revenue for the company. Nike promotes its products through emotional branding, storytelling, physical marketing, and promotion through social media channels (Thompson, 2016).

Service: Nike had 100 million customers as of 2017, whom the company wants to keep. It therefore provides customer registration services, discount facilities, etc. (Henry, 2012).


Three recommendations to Nike on how Porter’s model can enhance or maximise marketing (exposure and impact) selling (increase sales traffic):

• Nike is always focusing on improving its primary business activities. So, to improve its business value, it should follow Porter's model.

• Nike uses its website and social media channels for all of its business activities, which makes it more reputed and trustworthy. So, it should use Porter's model to maximise its marketing activities across the website and social media channels.

• By using the service and operations elements of the model, the company can promote its selling.


It is important for the company to enter the minds of its customers and retain loyal customers through proper promotion and branding. Nike should include promotions on Tumblr and LinkedIn. Nike should use digital marketing more to engage its loyal customers and grow its base. This should be done through eco-friendly marketing and a sustainable product manufacturing system. The study shows that Nike Inc. had revenue of USD 37.4 billion in 2020, forecast to rise much higher over the next decade. The company spends heavily on branding its sports equipment through digital media channels and websites. A few recommendations follow on improving the company's supply chain to meet market needs.


In this section, three recommendations are made for Nike Inc. on how it can enhance or maximise the value offered to the customers by producing more ethical and eco-friendly sneakers.

1. It is recommended to increase the sales per customer to hold loyal customers through digital marketing and using effective social media channels.

2. Nike should retain customers longer through offers and discounts and providing good quality products. Nike should fulfil consumer demand in every season.

3. It is recommended to lower the cost of the sneakers to increase business value. Also, it can lower the cost by using renewable and recycled resources in making footwear. Nike should completely stop using freshwater for any kind of manufacturing purpose as freshwater is not sustainable for manufacturing and also very costly to maintain.


Read More

MIS607 Cybersecurity - Mitigation Plan for Threat Report Sample

Task Summary

Reflecting on your initial report (A2), the organisation has decided to continue to employ you for the next phase: risk analysis and development of the mitigation plan.

The organisation has become aware that the Australian Government (AG) has developed strict privacy requirements for business. The company wishes you to produce a brief summary of these based on real-world Australian government requirements (similar to how you used real-world information in A2 for the real-world attack).

These include the Australian Privacy Principles (APPs), especially the requirements on notifiable data breaches. PEP wants you to examine these requirements and advise them on their legal obligations. Also ensure that your threat list includes attacks leading to customer data breaches. The company wishes to know whether the GDPR applies to them.
You need to include a brief discussion of the APPs and the GDPR and the relationship between them. This should cover the main points.

Be careful not to use up word count discussing cybersecurity basics. This is not an exercise in summarizing your class notes, and such material will not count towards marks. You can cover theory outside the classes.


Beginning with the threat list:

- You need to align threats/vulnerabilities, as much as possible, with controls.

- Perform a risk analysis and determine controls to be employed.

- Combine the controls into a project of mitigation.

- Give advice on the need for ongoing cybersecurity, after your main mitigation steps.


- You must use the risk matrix approach covered in classes. Remember risk = likelihood x consequence. (Use the tables from Stallings and Brown and remember to reference them in the caption.)

- You should show evidence of gathering data on likelihood, and consequence, for each threat identified. You should briefly explain how this was done.

- At least one of the risks must be so trivial and/or expensive to control that you decide not to control it (in other words, in this case, accept the risk). At least one of the risks, but obviously not all.

- Provide cost estimates for the controls, including policy or training controls. You can make up these values but try to justify at least one of the costs (if possible, use links to justify costs).
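The risk-matrix arithmetic described above (risk = likelihood × consequence) can be sketched in a few lines. The band boundaries below are assumptions made purely for illustration; in the actual report you must use the tables from Stallings and Brown as instructed.

```python
# Sketch of the risk = likelihood x consequence approach on a 1-5 scale.
# The band thresholds are illustrative assumptions, not Stallings & Brown's.

def risk_level(likelihood: int, consequence: int) -> str:
    score = likelihood * consequence  # both rated 1 (lowest) to 5 (highest)
    if score >= 15:
        return "Extreme"
    if score >= 10:
        return "High"
    if score >= 5:
        return "Medium"
    return "Low"

# Example ratings (taken from the threat analysis later in this report):
for name, l, c in [("MITM", 4, 5), ("Drive-by", 2, 2), ("SQLi", 5, 5)]:
    print(f"{name}: score={l * c}, level={risk_level(l, c)}")
```

Computing the score programmatically like this makes it easy to sanity-check that every threat's stated risk level is consistent with its likelihood and consequence ratings.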



Network security breaches end up costing millions throughout the world because of cyberattacks that target hundreds of network assets, including network software and hardware as well as information assets. As per Chahal et al. (2019), "an attacker executes a scan throughout the entire network to find vulnerable hosts, compromises the vulnerable host by installing malware or malicious code (e.g., Trojan Horse), and attempts to carry out actions without the knowledge of the compromised hosts". That is why it is important to have a network security system that protects users' private information while also allowing them to communicate with one another. Threat modelling is the process of identifying, assessing, and evaluating possible hazards to a system, and makes it possible to identify dangers in an orderly fashion. Because STRIDE (Spoofing, Tampering, Repudiation, Information Disclosure, Denial of Service, and Elevation of Privilege) is a comprehensive risk model, this discourse uses it in place of other threat models (Aikat et al., 2017).
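The STRIDE categorisation described above can be sketched as a simple tagging scheme. The threat-to-category assignments below are illustrative examples only, not this report's actual Table 2.

```python
# STRIDE: the six canonical threat categories, with example (illustrative)
# assignments of threats to one or more categories.

STRIDE = {
    "S": "Spoofing",
    "T": "Tampering",
    "R": "Repudiation",
    "I": "Information Disclosure",
    "D": "Denial of Service",
    "E": "Elevation of Privilege",
}

# Hypothetical tagging -- a real report would justify each assignment.
threats = {
    "Man in the Middle": ["S", "T", "I"],
    "SQL Injection": ["T", "I", "E"],
    "Phishing": ["S", "I"],
}

for name, cats in threats.items():
    print(f"{name}: {', '.join(STRIDE[c] for c in cats)}")
```

Keeping the categorisation as data rather than prose makes it straightforward to cross-check that every threat in the list has been assigned at least one STRIDE category.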

In the present situation, the packers want to protect their system, since their vendor, JBS Foods, has been the victim of cybercrime in the past. Security experts have been brought in to assess the risks and vulnerabilities associated with the intrusions. This report continues the threat discovery carried out in the previous paper, using data flow diagrams, context diagrams, and the STRIDE approach. All vulnerabilities and threats pertaining to the attack are therefore discussed in this report. The report further provides a risk matrix, along with a threat control and mitigation scheme. Cost computations are also included for the threats listed.

Australian Privacy Act vs GDPR


- Living people are protected under the GDPR. Personal data of deceased persons is not protected by the GDPR, since Member States are responsible for enforcing their own laws in that area. The Privacy Act safeguards the personal information of 'natural persons', described as 'individuals' under the statute. Because "individual" implies a living person, the Privacy Act is likewise not applicable to deceased individuals, even though this is not explicitly stated.

- Under the GDPR, public bodies can be data controllers as well as data processors. All APP entities, public or private, are subject to the Privacy Act.

- The GDPR refers to private information as "personal data" and the APPs as "personal information", yet they are fundamentally referring to the same thing (Yuste & Pastrana, 2021).



Risk Evidence Analysis

Table 1- Risk Evidence Analysis

Threat List & STRIDE Categorization

Table 2 - STRIDE Categorization

Meaning of Risk Levels and Likelihood

Figure 1 - (Stallings & Brown, 2018)

Figure 2 - (Stallings & Brown, 2018)

Threat Analysis

Table 3 - Threat Analysis


Man in the Middle Attack

Threat – A "man in the middle" (MITM) attack is one in which a perpetrator places themselves in the middle of an interaction between a user and an application. This can be done for eavesdropping purposes or to impersonate one of the participants in the dialogue.

Likelihood: 4 Consequence: 5

The threat has quite a high chance of happening in reality, and the impact associated with it is significantly high. Therefore, the above likelihood and consequence ratings were chosen.

Risk Level : Extreme

Standard mitigation

- Security policy for the entire organization is a must
- Employee training program and education
- Regular IT security auditing

Specific mitigation

- IPSec
- Network Monitoring Solutions
- Segmentation of Network

Techniques: Avoid Risk

End-Point Attack

Threat – End-point attacks are attacks that may arrive via malware, spear phishing, insiders or other means, but which target end-user devices themselves.

Likelihood: 3 Consequence: 4

The threat has a medium chance of happening in reality, and the impact associated with it is fairly high. Therefore, it poses a medium level of risk.

Risk Level: Medium

Standard mitigation

- Security policy for the entire organization is a must
- Physical security and biometric authentication wherever necessary
- Following an IT security framework such as TOGAF or ITIL

Specific mitigation

- Endpoint Hardening
- Password and Biometric lock
- Anti-virus and Anti-malware solutions
- Firewall on Endpoints

Techniques: Mitigate Risk

SQL Injection Attack

Threat – SQL injection (SQLI) attacks target databases connected to online forms and portals. Social networking sites, webstores, and institutions are among the most often targeted web apps. Medium and small organisations are extremely vulnerable to SQLI attacks because they are unfamiliar with the methods fraudsters employ and how to counter them (Goel & Nussbaum, 2021).

Likelihood: 5 Consequence: 5

The threat has quite a high chance of happening in reality, and the impact associated with it is significantly high as well. Therefore, it is an 'extreme' level of risk.

Risk Level: Extreme

Standard mitigation

- Regular IT security auditing
- Routine vulnerability scanning
- Following an IT security framework such as TOGAF or ITIL

Specific mitigation

- WAF (Web Application Firewall)
- Web sanitization schemes
- Input validation techniques
- Captcha systems
- Whitelist & Blacklist known fraud IPs

Techniques: Mitigate Risk
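The input-validation and sanitisation controls listed for SQL injection are commonly implemented with parameterized queries. Below is a minimal sketch using Python's built-in sqlite3 module; the table and data are invented for illustration.

```python
import sqlite3

# Parameterized queries keep user input out of the SQL text: the ? placeholder
# is bound as data, so the input can never alter the query's structure.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_user(name: str):
    """Safe lookup: the name is passed as a bound parameter, never
    concatenated into the SQL string."""
    return conn.execute("SELECT role FROM users WHERE name = ?",
                        (name,)).fetchall()

print(find_user("alice"))          # [('admin',)]
print(find_user("' OR '1'='1"))    # [] -- the classic payload finds nothing
```

Had `find_user` built the query by string concatenation, the second call's payload would rewrite the WHERE clause and dump every row; with the placeholder it is just an oddly named user that does not exist.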

Emotet Attack

Threat – Junk mail is the most common method of propagating Emotet. The malware can arrive in a variety of forms, including malicious scripts and macro-enabled documents. A feature of the software means that some anti-malware programs are unable to identify Emotet, and its worm-like characteristics help it spread. Emotet has been judged among the most expensive and damaging malware families, affecting commercial and government sectors, individuals and organisations, and costing well over $1 million per incident to clean up (Zhang et al., 2021).

Likelihood: 4 Consequence: 5

The threat has a high chance of occurring, and the impact associated with it is significantly high. Therefore, the aforementioned likelihood and consequence ratings were chosen.

Risk Level: Extreme (20)

Standard mitigation

- Bring your own device policy must be created
- Regular IT security auditing
- Routine vulnerability scanning

Specific mitigation

- Executable execution prevention
- User privilege definition
- Email spam filtration
- Anti-macros
- Endpoint security systems
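Email spam filtration and anti-macro controls often start with blocking the attachment types Emotet favours: executables and macro-enabled documents. A minimal sketch in Python; the extension blocklist is an illustrative assumption, not an exhaustive policy:

```python
import os

# Hypothetical blocklist: executable and macro-enabled formats commonly
# abused by Emotet-style malspam campaigns.
BLOCKED_EXTENSIONS = {".exe", ".scr", ".js", ".vbs", ".docm", ".xlsm", ".pptm"}

def is_blocked_attachment(filename: str) -> bool:
    """Return True if the attachment's extension is on the blocklist."""
    _, ext = os.path.splitext(filename.lower())
    return ext in BLOCKED_EXTENSIONS
```

A real mail gateway would inspect file content (magic bytes) as well, since extensions alone are easily spoofed.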

Techniques: Mitigate Risk

Drive-by Attack

Threat – A drive-by download attack exposes a device by downloading malicious programmes without the user's knowledge or consent (Hoppe et al., 2021).

Likelihood: 2 Consequence: 2

The threat has a low chance of occurring, and the impact associated with it is also low. Therefore the risk level is low.

Risk Level: Low

Standard mitigation

- Bring your own device policy must be created
- Security policy for the entire organization is a must

Specific mitigation

- Eliminating any outdated systems, libraries or plugins (Liu et al., 2017).
- Updating all systems
- Web-filtering software

Techniques: Accept Risk (Controls are rejected here because the cost of remediation is extremely high: the entire system would need to be restructured and re-thought, which involves detailed planning, business disruption and resulting business losses)

Phishing Attacks

Threat – Phishing is the practice of sending fake emails that typically appear to come from a trustworthy organisation. Phishing emails and text messages often leverage real-world concerns to entice recipients to click on a link. Because they are designed to make individuals respond without thinking, scam mailings (or phishes) can be hard to detect. Text, email and voice phishing scams are the three most common forms of these attacks on the Internet (Sangster, 2020).

Likelihood: 3 Consequence: 5

The threat has a medium chance of occurring and its impact is high. Therefore the risk level is rated medium.

Risk Level: Medium

Standard mitigation
- Bring your own device policy must be created
- Employee training program and education

Specific mitigation

- SPAM filter
- Anti-virus and Anti-Malware
- Blocking fraudulent IPs
- Forced HTTPS on all communications
- Two-factor authentication
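Two-factor authentication commonly uses time-based one-time passwords (TOTP, RFC 6238), built on the HOTP construction of RFC 4226. A minimal sketch using only the Python standard library:

```python
import hashlib
import hmac
import struct
import time

def hotp(key: bytes, counter: int, digits: int = 6) -> str:
    """HMAC-based one-time password (RFC 4226)."""
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation per the RFC
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

def totp(key: bytes, interval: int = 30, digits: int = 6) -> str:
    """Time-based one-time password (RFC 6238): HOTP over a time counter."""
    return hotp(key, int(time.time()) // interval, digits)
```

The server and the user's authenticator app share the secret key; both derive the same six-digit code for the current 30-second window, so a stolen password alone is not enough to log in.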

Techniques: Avoid Risk

Attack on Passwords

Threat – Simply put, in password attacks hackers aim to steal passwords by guessing, brute-forcing or other means.

Likelihood: 4 Consequence: 5

The threat has a somewhat high probability of occurring, and the impact associated with it is significantly high. Therefore, the aforementioned likelihood and consequence ratings were chosen.

Risk Level: Extreme

Standard mitigation

- Bring your own device policy must be created
- Employee training program and education
- Physical security and biometric authentication wherever necessary
- Regular IT security auditing

Specific mitigation

- Complex passwords
- Password policy
- Storing passwords in hashed or encrypted format
- Using SSO (Single Sign-On) and OAuth-based logins
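Stored passwords are usually protected with a slow, salted key-derivation function rather than reversible encryption. A minimal sketch using Python's standard-library PBKDF2; the iteration count is an illustrative choice:

```python
import hashlib
import hmac
import os

ITERATIONS = 600_000  # illustrative work factor; tune to your hardware

def hash_password(password: str) -> tuple:
    """Return (salt, digest) for storage; a fresh random salt per password."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    # Constant-time comparison avoids leaking timing information.
    return hmac.compare_digest(candidate, digest)
```

Per-password salts defeat precomputed rainbow tables, and the high iteration count slows brute-force guessing even if the database is stolen.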

Techniques: Avoid Risk


Ransomware Attack

Threat – Ransomware is software that uses encryption to hold a victim's data hostage and demands a payment in exchange for its release. Because the malware can disable an entire operational network or encrypt a user's information, and because of their size and willingness to pay, major corporations are the primary targets of ransomware attacks (Shaji et al., 2018).

Likelihood: 4 Consequence: 5

The threat has a somewhat high probability of occurring, and the impact associated with it is significantly high. Therefore, the aforementioned likelihood and consequence ratings were chosen.

Risk Level: Extreme

Standard mitigation

- Regular IT security auditing
- Routine vulnerability scanning
- Following an IT security framework such as TOGAF or ITIL

Specific mitigation

- Anti-Malware and Anti-Spyware tools
- Regular vulnerability scanning
- Auditing of vulnerabilities
- Employee training on Ransomware

Techniques: Avoid Risk

Breach of website using E-Skimming

Threat – With the rise in popularity of online shopping, a cyberattack known as e-skimming is becoming increasingly common. ATM and petrol station skimmers have posed a threat to customers for a long time, but the practice has evolved recently. These attacks affect individual privacy because they can steal ‘personal information’ as defined in the Australian Privacy Act (Shaukat et al., 2020). Attackers exploit third-party JavaScript and open-source libraries to gain access to websites' Shadow Code, often using documented zero-day flaws in third-party JavaScript. S3 storage buckets and code repositories may also be vulnerable where proper security measures are lacking. A digital skimmer steals credit card information by injecting malicious code into third-party programs on the website. Because third-party scripts and libraries shared among websites are the primary source of these assaults, they are also known as supply chain attacks.

Likelihood: 3 Consequence: 3

The threat has a medium-to-low chance of occurring, and the associated impact is also medium to low. The overall risk therefore remains low.

Risk Level: Low

Standard mitigation

- Security policy for the entire organization is a must
- Routine vulnerability scanning

Specific mitigation

- Patching the website
- Using PCI-DSS Compliance
- Multi-factor authentication
- Data encryption

Techniques: Avoid Risk

Breach of website using XSS

Threat – Cross-Site Scripting (XSS) allows malicious scripts to be introduced into normally safe and secure websites. Because the malicious code appears to come from a trusted source, it can gain access to a device's cookies, digital certificates and other confidential material. In most cases, cross-site scripting exploits enable an attacker to impersonate a vulnerable user, perform any actions that user can take, and access some of the user's personal data. If the target has elevated privileges within the application, the attackers may be able to take complete control of the application and its data.

Likelihood: 5 Consequence: 4

The threat has a high probability of occurring and its impact is also high. Therefore, it is categorised as an extreme risk.

Risk Level: Extreme

Standard mitigation

- Bring your own device policy must be created
- Following an IT security framework such as TOGAF or ITIL

Specific mitigation

- Input Sanitization
- Output escaping
- Content Security Policy
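Output escaping neutralises XSS by converting HTML metacharacters to entities before user input is rendered. A minimal sketch with Python's built-in html module; the render function is a hypothetical template step:

```python
import html

def render_comment(comment: str) -> str:
    # html.escape turns <, >, & and quotes into entities, so an injected
    # <script> tag is displayed as text rather than executed by the browser.
    return f"<p>{html.escape(comment)}</p>"
```

Escaping at output time, rather than only sanitising at input time, protects every render path regardless of where the data originally came from.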

Techniques: Mitigate Risk


Conclusion

The paper listed the major cybersecurity attacks that are applicable to PEP, keeping in mind the attack on JBS Foods. As a result, many recently prominent attacks such as phishing, ransomware, malware, DoS, SQL injection and e-skimming attacks are included, reflecting the threat landscape of recent years as well as the nature of the business. Attacks within each type are classified further and explained in detail. Furthermore, the paper introduced a set of countermeasures and mitigation schemes classified according to the defence strategies for PEP.



PROJ-6012 Managing Information Systems/Technology Sample


In this subject students will understand the significance of project management and be introduced to the methods and tools available for managing projects, especially those related to Information Technology. They will recognise the importance of aligning projects with the goals and objectives of the organisation and learn about the methods used to select projects. Throughout the course, students will be encouraged to present their opinions on the topics covered in each module by posting messages on the discussion forum. Assessment 3 is intended to evaluate the students' responses, which will highlight their understanding of the topics. The discussion postings will be initiated by the facilitator as the topics are covered in each module, and the students will be expected to respond to the discussion questions. They will be required to present their views, either supportive or contradicting, drawing on their knowledge of the discipline, prior experience in the industry or existing published work.

These discussions will provide immense benefit to the students, as they will get an opportunity to learn from the experience and knowledge of other students and the facilitator. They will stay updated on current issues in the industry. Further, the students will get an opportunity to present their own thoughts and knowledge of the discipline, which will enhance their confidence and skill. Discussions on a professional forum will improve their communication and academic writing skills, which will have far-reaching benefits in their careers.

Besides that, the facilitator will get an opportunity to understand the background, knowledge and level of understanding of each student. This becomes more important for an online course, as there is minimal face-to-face communication between the students and the facilitator. This will help the facilitator evaluate the students better and provide the required support.

Hence, the students are encouraged to actively participate in the discussion posts, learn from the discussions and utilise the opportunity to showcase their skills and knowledge in the discipline.


Module 1

Discussion Topic 1: Controversial Impact of IS/IT Projects

The Australian government is well known for its focus on security and its implementation of modern technologies in controversial operations for maintaining security within the country. The implementation of drones in airports for passenger movement tracking, bomb detection and more contributed to an overall increase in security (Meares, 2018). However, the major information security project that led to controversy involved checking digital communications to provide security. The project involves the collection of digital messaging data from organisations such as Facebook and Apple to track possible terrorism, analyse the behavioural patterns of suspected people and more. This process can be described as controversial because it requires these technology brands to build backdoor access into their secure messaging platforms such as WhatsApp and iMessage. The security of the applications developed by Apple and Facebook may therefore be affected negatively, with hackers or unauthorised people targeting these backdoors to gain access to the social media networks.

The development of backdoor access, and data collection through it by hackers or unauthorised people, can expose the details of a large number of people using these messaging services, which can eventually lead to vulnerabilities and negative impacts on people's lives (Hanna, 2018). The Australian Parliament passed the controversial legislation, creating a law that allows the Australian government to collect such data. However, there has been significant criticism from the Australian population regarding the government's breach of people's privacy. As a project manager, I would focus on improving technology implementations and invest in more modern technologies rather than contribute to possible security failures for major brands that provide communication facilities to people worldwide.

Discussion Topic 2: Managing Existing Architecture

In the case of an existing architecture being inefficient in contributing to project needs, the ideal solution involves upgrading the architecture using newer ideas and concepts. However, as the company has made several high-value investments in technology, it may not be efficient to invest more to change the architecture; further investment can lead to loss of finances and issues with long-term brand operations. Therefore, as a project manager, I feel it can be much more efficient to integrate Agile project management processes. Agile involves the use of Scrum for testing the system while applying continuous improvement processes for improved project operations (Paquette & Frankl, 2016). Continuous improvement is efficient in tracking the various segments or components of the project that may be improved, and these improvements can be made with the smallest possible change in financial requirements.

Project changes and improvements within low financial requirements can lead to higher performance of the entire system without affecting budgets. They can ensure that the project operates to the best of its abilities based on the existing architecture, with changes in operational processes delivering the highest gains in project goal achievement. Additionally, Scrum-based testing can be efficient in reviewing the existing architecture and the changes in strategic operations. However, significant time is required to implement these changes; where the project must be completed within a short time, investment in architectural changes may be more efficient.

Discussion Topic 3: Computing Models for Global Applications

Global IS (Information Systems) or IT (Information Technology) projects operate in close relation with the available infrastructure, and a lack of infrastructural benefits can negatively impact operations. Additionally, technology is constantly improving and growing at a fast pace, so traditional computing models may not be efficient for modern IS and IT projects. Utility computing, however, allows the personalisation of IS or IT operations because the various resources to be used can be managed manually. This allows projects to be implemented as needed, with processes such as storage, data transmission and service management changed as requirements evolve.

The geographic scope of a project also contributes to the selection of a specific operational model for these IT and IS projects. Geographic scope refers to the availability of specific resources, services and products in a particular location. Where a base architecture for developing the project is unavailable in the selected location, it is essential to change the strategic approach. Procuring materials from other countries or regions to build a base architecture can contribute to higher costs, so traditional models may be much more applicable for operations (Cook, 2018). The use of these traditional models may negatively affect the project's operational speed and efficiency, but it can have serious positive impacts on the sustained operation of the project along with long-term growth and development.

Response 1

IS/IT projects containing virtual team members can be managed efficiently with the implementation of a resource management tool. Resources include the materials required for the development of a project, the equipment required to utilise those materials, and the human resources or people involved in the development process. Project management applications such as MS Project allow these resources to be tracked, along with the development of a schedule for the project, which can be used to achieve the goals of managing IS/IT projects. I believe that the use of a work breakdown structure can also be efficient in setting out the various tasks, including tracking the human resources or team involved.

Module 2

Discussion Topic 1: Business Process Reengineering

The value stream mapping (VSM) tool is very helpful in providing leaders, team members and stakeholders with a unified view. This fresh view helps them step out of their data silos and gain a more holistic understanding of the whole process and of their corresponding roles and contributions towards completing the finished product. This broader perspective helps each user to see the significance, value and contribution of their role in the product delivery process. Without the support of a value stream mapping tool, team members may lose this perspective and discount or distort the value of their role.

For example, using VSM tools helps team members achieve clarity and understand the value of their roles in the project, which improves team and individual morale. However, value stream mapping can create confusion in complex, multi-operational processes (Hari, n.d.). Its limitations include a bias towards low-variety, high-volume manufacturing, which generally favours assembly-line setups geared for continuous flow. Following the process workflow alone fails to consider the allocation of activities such as WIP storage, the utilisation of shop-floor space, production support and material handling; this creates confusion within the process and cannot show the influence of ineffective material flows on WIP, operating expenditure and order throughput within the facility. Overall, the VSM process helps employees recognise areas for improvement and work better towards achieving goals and objectives.

Discussion Topic 2: Project Scope Creep

A typical example of scope creep is altering the project's scope to meet the customer's changing requirements. It can feel overwhelming in the moment but may serve a higher purpose. Hence, before a project commences, the project authority must acknowledge the probability of scope creep and plan for it. A notable instance of project scope creep is a significant delay in completing a project because of a client's continual change requests, as seen in lawsuits involving the project manager responsible for the project.

Because change is generally inevitable and may impact the project scope, it is necessary at the very least to know how scope creep can be managed adequately. Ways to manage scope creep must be implemented to help the project meet its objectives (Kerzner, 2017). Communication plays a vital role in project management and helps in dealing with the changes needed to accomplish project objectives. Most communication generally passes through the project manager; the project team and other stakeholders treat them as the chief point of communication when responding to changes. Organisations should also incorporate transparency into the project, as it plays an essential role in project management. Everybody must be kept on the same page as the project progresses. This supports the team in working collaboratively, delivering faster and meeting the different sets of objectives.

Response 2

The focus on IS/IT in recent years has increased due to the need for modern technology implementations for continued operations. However, IS/IT should not be a strategic driver and rather focusing on proper planning processes can contribute to more significant positive impacts on the project. IS/IT and modern technologies should be implemented to support and work upon the plans developed, thereby leading to efficiencies in project completion. Based on my understanding as a project manager, I feel that the organisation is involved in the implementation of projects based on a plan of action that involves an assessment of the inputs required for the project, expected outputs, budget and time-frame required and more.

Module 3

Discussion Topic 1: Schedule Management

Schedule management begins with defining the project goals and writing down the key deliverables or milestones needed for the project to end successfully. Next, recognise the stakeholders and list every individual who needs to be involved in the project, even in a simple role. Determine the final deadline so it is known when the project must be finished and what that entails, ensuring enough time is allowed for conflicts that may come up along the way (Marion, 2018). List every task, consider the deliverables and milestones designated in the previous step, and break them into smaller components and subcomponents. Assign a team member responsible for every activity and decide the allocation of components and subcomponents, with transparent deadlines. Work backwards from the deadline to set dates for every task, building in slack because delay is inevitable and should not be allowed to disturb the project. Sequencing is a vital consideration as well, since several activities will need to be completed in order. Then organise the project schedule in one tool and share it with the team. A successfully created project plan, organised so that every member involved can see it and work according to it, is significant for the organisation. As the project progresses, it reflects on the project managers and how they use the schedule framework to complete the project; the framework is designed to share clear information and avoid challenges.
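The "work backwards from the deadline" step above can be sketched in code. Assuming each task has a fixed duration in days and tasks run strictly in sequence (a simplification of real scheduling), the latest start date of each task follows from the deadline:

```python
from datetime import date, timedelta

def backward_schedule(deadline: date, tasks: list) -> list:
    """Given (name, duration_in_days) pairs in execution order, work
    backwards from the deadline to find each task's latest start date."""
    schedule = []
    finish = deadline
    for name, days in reversed(tasks):
        start = finish - timedelta(days=days)
        schedule.append((name, start))
        finish = start  # the preceding task must finish when this one starts
    return list(reversed(schedule))
```

For a 10-day build ending on the deadline, preceded by a 5-day design task, the function yields each task's latest start; a real schedule would also add slack for the inevitable delays mentioned above.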

Discussion Topic 2: Cost Management

IT professionals are often much more focused on completing the project than on cost management. They are also very concerned with the development of the project and the potential issues that may arise, so they may overlook project costs and concentrate on process, even though managing costs within the project is essential for project success and for avoiding expenditure overruns (Kerzner, 2017). Many IT professionals have limited exposure to the business side and therefore do not appreciate the importance of accounting concepts or other financial principles; this is another vital reason IT professionals overlook project cost. Cost management in IT projects is considered a difficult task, as organisations encounter varied issues and cost estimates are sometimes poorly defined. This creates difficulty in identifying the specific determinants required for assessing the successful accomplishment of the project, leaving cost management under-defined and one of the more difficult activities for IT professionals. Sometimes IT professionals cannot obtain adequate requirements, or have undefined requirements, in the initial stages of the project. A specific but thorough requirement should be evaluated to understand and disclose the budget; without it, IT professionals overlook cost management and focus elsewhere.

Response 3

In a typical IT project, the first set of activities involves developing a plan for changes and implementing those changes based on the basic framework of the IT system. The cost and time spent planning new developments in the project contribute to the sunk costs. These plans include the development of a foundation or base for the project. For example, in developing a product, the internal circuit board serves as the base of operations, and once development is initiated the board cannot be reused for a different activity. This leads to sunk costs in the case of improper development processes.

Module 4

Discussion Topic 1: Quality Control in Projects

There are several types of quality control techniques used by organisations, such as histograms, control charts, Six Sigma and scatter diagrams. In the selected organisation, the Six Sigma technique will be used. It is one of the important methods organisations use for the better functioning of business activities. As per Ottou, Baiden & Nani (2021), Six Sigma is a control methodology that helps improve business processes through statistical analysis. It is a highly disciplined, data-driven approach that helps eliminate the defects present within the business, in any type of organisation or business process. The goal of the Six Sigma method is to streamline quality control within the business or manufacturing process so that there is little to no variance within the process. The goal of the concerned Six Sigma project is to identify and eliminate any defects that cause variations in quality by defining the sequence of stages towards the focused target.

One vital reason Six Sigma is considered important is that it helps decrease defects. By utilising the Six Sigma method, employees become capable of recognising the problem areas, along with recurring challenges, that affect the overall quality expectations for the product or service from the consumer's viewpoint. The Six Sigma process provides the necessary skills and tools to identify the challenges or bottleneck areas that can drag down performance or production.
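Six Sigma quantifies defect rates as defects per million opportunities (DPMO) and converts them into a sigma level. A minimal sketch using Python's statistics module; the conventional 1.5-sigma long-term shift is assumed:

```python
from statistics import NormalDist

def sigma_level(defects: int, units: int, opportunities_per_unit: int) -> float:
    """Convert a defect count into a long-term sigma level via DPMO."""
    dpmo = defects / (units * opportunities_per_unit) * 1_000_000
    # The 1.5-sigma shift is the customary allowance for long-term process
    # drift, under which the 6-sigma target corresponds to 3.4 DPMO.
    return NormalDist().inv_cdf(1 - dpmo / 1_000_000) + 1.5
```

For example, 34 defects across 10,000 units with 1,000 opportunities each is 3.4 DPMO, which this formula rates at approximately six sigma.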

Discussion Topic 2: Managing Human Resources

Paying individuals their worth: when setting employee salaries, it must be ensured that the pay is consistent with other organisations in the same industry and geographic location. Providing a pleasant working environment: everyone wants to work in an atmosphere that is clean and stimulating and that makes them feel good rather than bad. Offering opportunities for self-development: team members are more valuable to the company, and to themselves, when there are opportunities for them to learn fresh skills (Word & Sowa, 2017). Providing the team with the training required to advance and gain knowledge will help them remain in touch with the latest developments.

Foster collaboration in the team and encourage the team members to participate fully by inviting their input and suggestions on how to do things better. Asking questions and acting on the responses helps in implementing solutions to change.

Encourage happiness, as happy employees tend to be positive and enthusiastic members of the team; it should be checked continuously whether individuals are happy, with the necessary steps taken if they are not. Setting clear goals is the job of leaders and team members working collaboratively; once the goals are set, their priority and each member's role in meeting them should be made clear. Avoid useless meetings, as unproductive meetings are a continual waste of time: prepare an agenda for each meeting, distribute it in advance, invite only those needed, start the meeting promptly and finish it quickly.

Discussion Topic 3: Conflict Management

A common experience of conflict within a team is a misunderstanding or mistaken perception. This arises between employees, or between leaders and employees, when communication between them fails: information about something is either misrepresented or interpreted in the wrong way, which in turn leads to discomfort, resentment or avoidance. It needs to be solved directly by clearing up the possible misinterpretations between the individuals. Compromising is considered the most popular technique for solving conflicts in projects (Posthuma, 2019); each party's interests are satisfied to the extent that the compromise is successful. Professionals also ask for help when needed: if they truly understand that a conflict is beyond their capacity to solve, they call in the sponsor or ask the project sponsor to help.

Appeasement is mostly effective in circumstances where conceding a point is inexpensive for one party but beneficial to the other individual or team. Through delegation, project managers hand over a great deal of work and responsibility: by delegating conflict resolution to the concerned individual, that individual is offered a chance to develop themselves. Another good way to resolve conflict is through brainstorming sessions within the organisational project. By identifying situations and locating problems before they create damage, brainstorming sessions can develop powerful interaction between team members. This enables them to understand each other and develop the strong communication needed to address the problems of conflict.

Response 4

Recruitment and team retention are achieved via proper HR management procedures. The HR management processes have limited impact on the projects as a whole but contribute to the availability of skilled personnel for developing the projects. These HR processes involve a focus on providing better rewards for employee performance. Financial rewards such as grants, bonuses and more for high-value performances within the project can be highly efficient in retaining employees. Similarly, skilled employees can be recruited by ensuring quality financial provisions based on their abilities. Furthermore, the use of non-financial rewards such as feedback can also foster positive mindsets and ensure employee retention.

Module 5

Discussion Topic 1: Project Communication Methods

The benefits of interactive whiteboards include participation: their most important benefit is that they enable higher participation compared to ordinary whiteboards. They also preserve data: everything shown on an interactive whiteboard comes from the connected system, is projected directly onto the whiteboard, and can be recorded straight to a hard drive or transferred to portable storage (Heldman, 2018). Several kinds of visuals may be used on interactive whiteboards, and videos may be loaded from websites or previously saved files. The disadvantages include the fact that inadequate communication may result in misunderstandings, which in turn lead to mistakes, missed deadlines and changed project directions. Miscommunication occurs when individuals share information without precisely understanding each other (Gronwald, 2017); this results in misinterpretation of details and facts, prompting team members to work from perceived data and facts. Performance reporting entails collecting and disseminating project information, communicating project progress and resource utilisation, and providing status and future insights to various stakeholders. Status reports give the present state of a project at a stated time, explaining where the project stands at that moment relative to the performance measurement baseline. Forecasting reports depict what is expected to happen on a project, predicting the project's expected status and future performance on different parameters and helping to allocate and track resources for optimum utilisation. Trend reports present a comparison between the project's present performance and its past performance over the same duration.

Discussion Topic 2: Stakeholder Engagement

Stakeholder engagement is revered as the method by which stakeholders engaged in the project collaboratively work together. There are several methods by which the participation of stakeholders is done to ensure the sharing of information and, messages effectively to accomplish the objectives of the project. Delegation is one of the most important methods for stakeholder engagement that results in effective communication and executing the tasks effectively.

Delegation can be defined as entrusting part of an activity, along with the responsibility and authority for it, to others and holding them accountable for its performance. Delegation occurs when an authority assigns work to subordinates, who are then liable for it. Delegation is very significant for executing tasks effectively, as it ensures work is completed on time (Seiffert-Brockmann, Weitzl & Henriks, 2018). It also ensures that decisions are taken collaboratively, with the parties agreeing on the decision. This method helps ensure that stakeholder requirements are determined from the outset, by the stakeholders themselves and by the authorities, with the communities, through their representatives, deciding how to intervene and act together. The approach ensures that stakeholder participation exists and continues beyond the establishment stage. It also includes monitoring and evaluation practices that help pinpoint shortcomings of the plan, with an eye to probable future improvements.

Discussion Topic 3: Conducting Procurements

Competitive negotiation is a source selection approach also known as positional bargaining: the parties hold to their positions and are inflexible towards the other party's interests. Competitive bidding is used in public projects, where it is generally stipulated by law. This source selection approach is used when two or more contractors are willing to compete for the work. It takes time to create plans and specifications, prepare bids, assess submitted bids, and award the contract, but it provides adequate detail concerning the specifications, performance duration, and workmanship quality expected for the project.
Non-competitive negotiation is another source selection process used for awarding contracts. It is understood as the establishment of contractual terms and conditions, including but not restricted to the contract price, by negotiating with a single vendor, without the external procedures of competitive bidding, and where the contract terms or technical particulars are not specifically defined in advance (Bhargove, 2018). In this method, the developer and contractor jointly assess pricing information and technical proposals to reach an agreed scope and cost for the work.

Competitive negotiation, in another form, involves soliciting proposals from selected offerors, with whom the developer subsequently negotiates to achieve the best value. It also allows the developer to refine the requirements and scope of work through preliminary negotiations with the chosen offerors, who then submit competitive bids based on the agreed-upon needs and scope of work.

Response 5

When human resources must be outsourced, it is essential to focus on recruiting skilled employees or project team members at the location of operations. Developing positive relations with vendor brands that can handle the outsourced operations is also effective. This allows the brand to outsource some of its operations to other countries while conducting other activities within the organisation itself, leading to high-value brand performance and achievement of the necessary goals. Negotiation with the local team can also help ease the severity of issues faced due to shortcomings in the human resource outsourcing process.

Module 6

Discussion Topic 1: Risk Management

Project risks are common: every project has the possibility of experiencing a certain number of risks. Several risks can arise that may distort the project or cause business failure, and project risk analysis is carried out to monitor project performance from start to end and eliminate loss or failure. The types of risk present include: cost risk, the mismanagement or shortage of project funds through an inflated budget or other constraints, which threatens project accomplishment; and scope creep risk, an unauthorized or uncontrolled change to the initially intended project scope that can result in the extra cost of further products, features, and functions (Wolke, 2017).

Operational risk can terminate or stall a project where there is weak implementation of crucial operations and core procedures such as procurement. These risks can lead to direct or indirect loss from failed or inadequate strategies, such as IT risk and direct human and process implementation risk. Skills resource risk, the risk of over-relying on internal staff, is potentially a high project risk, since project operations are sometimes staggered in distinct waves at different locations, each needing the required team members; it sits alongside technical risk.

The risk register is regarded as a strategic tool for controlling risk within the project. It identifies and describes the list of risks, provides space to explain each risk's probable impact on the project and the response planned should the risk occur, and enables the project manager to prioritise the risks.
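As a sketch, a risk register can be modelled as a small data structure whose probability-times-impact score drives prioritisation. The fields and example risks below are illustrative only, not a standard register template.

```python
from dataclasses import dataclass

@dataclass
class Risk:
    description: str
    probability: float  # likelihood of occurrence, 0..1
    impact: int         # severity if it occurs, 1 (low) .. 5 (severe)
    response: str       # planned response if the risk materialises

    @property
    def score(self) -> float:
        # Simple probability-times-impact score used for prioritisation
        return self.probability * self.impact

register = [
    Risk("Scope creep from unapproved feature requests", 0.6, 4, "Change-control board"),
    Risk("Cost overrun from inflated budget estimates", 0.3, 5, "Monthly budget reviews"),
    Risk("Key staff unavailable at rollout locations", 0.5, 3, "Cross-train backup staff"),
]

# The project manager reviews risks highest-score first
for risk in sorted(register, key=lambda r: r.score, reverse=True):
    print(f"{risk.score:3.1f}  {risk.description} -> {risk.response}")
```

In practice a register would also record owners, dates and status, but the score-and-sort step above is the prioritisation the text describes.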

Discussion Topic 2: Portfolio Risk Management

Portfolio risk management refers to the idea of identifying, assessing, measuring and managing various risks within a portfolio. The project portfolio mainly includes insights on the operations of the project taken into account, resources used for project completion, strategic goals of the brand and more. Therefore, the proper assessment of these processes is highly essential to ensure the tracking of portfolio risks and the mitigation of these risks.

As a project manager involved in the development of a cloud computing system project, I would first assess the project plans to track their efficiency in application. This involves ensuring that the plans can be capitalised on and that operations can be executed efficiently by following them (Stewart, Piros & Heisler, 2019). The viability of the project can also be assessed, which involves tracking the efficiency of the product to be developed and its expected market growth. A large number of cloud computing services are readily available in the market, and this process helps track how well this specific service can gain market advantage. A high potential market advantage can lead to increased opportunities for product sales and profitability of the cloud system.

The efficiency of the cloud system in contributing to return on investment is also taken into account. The ROI depends on the price at which the cloud systems are provided and the time taken for the investment to be recouped. The cloud systems are being provided at significantly low cost, so the return on investment is expected to be high. High ROI can contribute to increased utilisation of the cloud system within the market, thereby achieving market growth.
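The ROI logic described above is simple arithmetic: returns relative to the initial investment. The figures below are hypothetical, chosen only to illustrate the calculation.

```python
def roi(gain: float, cost: float) -> float:
    """Return on investment as a fraction of the initial cost."""
    return (gain - cost) / cost

# Hypothetical figures: $120,000 in cloud-subscription revenue
# against an $80,000 development investment
print(f"ROI: {roi(120_000, 80_000):.0%}")  # -> ROI: 50%
```

A negative result would indicate the project has not yet recouped its investment.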

Discussion Topic 3: Process-Based Measurements

The success of an IT/IS project can be evaluated via key performance indicators. The main KPIs for these projects include the number of customers or organisations using the IT/IS services, positive views of customers or brands towards those services, return on investment, impact on efficiency, and more. Based on customer numbers, a project can be considered successful if a high number of customers use the product or service; a large customer group using the services indicates high-value service provision and the success of the project (Project Management Institute, 2017). With regard to the Australian government project concerning the collection of data from Apple and Facebook, the customers using the services are government officials, and these processes enable them to collect data efficiently, thereby indicating project success.

The positive views of customers towards the project are a significant indicator of project success. IT projects, such as new cloud computing systems, that receive positive reviews from customers indicate that the project is efficient, well-defined and successful. Similarly, various organisations also use IS and IT-based services, and their positive views of these services, along with requests for long-term collaboration, can indicate that the project is successful (Project Management Institute, 2021). Return on investment refers to the financial returns gained relative to the investment made; where companies experience financial gains higher than their investment, the project may also be considered successful. The project's efficiency in achieving brand and individual goals can likewise indicate efficient operational processes.

Response 6

With regard to IS/IT projects, the two most critical success factors I take into account are the demand from customers or organisations procuring the IS/IT services, and the speed and efficiency of project operations. High demand for the product or service developed indicates that the project is efficient and provides value to customers. Similarly, speed of operations is essential in the modern world for achieving a large amount of work in a short time; an increase in customer satisfaction due to the speed of operations therefore helps confirm that the IS/IT projects are efficient.


MIS611 Information Systems Capstone Report Sample

Task Summary

For this assessment, you as a group are entering the second phase of the assessment process, Assessment 2, where your key output will be a Solution Prototype Document. By now your team will have completed the first phase via delivery of the Stakeholder Requirements Document, the key output of Assessment 1. It is important to note that consistency and continuity from one assessment to the next are vital in this project.

You will need to ensure that you use the project approach as advised in Assessment 1. This means that your solution needs to address the requirements documented in
Assessment 1 Stakeholder Requirements Document.

For Assessment 2, the Solution Prototype Document, your team is required to complete a 4,000-word report outlining the various aspects of your solution. It is expected that you will demonstrate how the solution addresses the requirements outlined in Assessment 1. A variety of prototyping tools are available to you; however, you will need to discuss your selection with your learning facilitator to establish the feasibility of the team's approach. The Solution Prototype Document should describe elements of the Solution Prototype using the appropriate tools for processes, data and interfaces.


In the previous assessment, you demonstrated your proficiency in the requirements analysis and
documentation process in alignment with the project framework that you selected. In this phase of the assessment cycle, you will design and develop your solution prototype in alignment with your selected project approach in response to the requirements elicited and documented. As outlined in Assessment 1, this will reflect your professional capability to demonstrate continuity of practice, progressive use of project frameworks and their appropriately aligned methods, tools and techniques.

Task Instructions

1. Review your subject notes to establish the relevant area of investigation that applies to the case. Re-read any relevant readings for this subject.

2. Plan how you will structure your ideas for your report and write a report plan before you start writing. Graphical representation of ideas and diagrams are encouraged but must be anchored in the context of the material, explained, and justified for inclusion. No Executive Summary is required.

3. Write a 4,000-word Solution Prototype Document outlining the various aspects of your solution that address the requirements outlined in Assessment 1.

4. The Solution Prototype Document should consist of the following structure:

A title page with the subject code and subject name, assignment title, case organisation/client's name, students' names and numbers, and lecturer's name


Solution 1: Token Based System

Payment systems can be account-based or token-based. In an account-based system, a transaction is completed by debiting the payer's balance and crediting the recipient's institution (Allcock, 2017). This means the transaction must always be documented and the people involved identified. In a token-based system, payment is accomplished by transferring a token that is equivalent to a certain amount of money; coins and banknotes are the most obvious examples. A token-based system is therefore preferable, where the CBDC token is like a banknote and is also referred to as a "coin". As when withdrawing money from a bank account, users load coins onto their computer or smartphone and have that amount debited from their account by their bank. Unlike digital bearer instruments held in a central database, the CBDC would be stored on the user's computer or mobile device, and no record of the owner's name is kept in the CBDC's database.

Privacy is ensured using blind signatures, a cryptographic method. A blinding operation, performed locally on the user's device, conceals the numeric value representing a coin from the central bank before the signature is sought, so the user receives a cryptographically signed coin without the central bank ever seeing it. In GNU Taler, this numeric value is a public key, and only the coin's owner has access to the corresponding private key. The central bank's signature on the coin's public key is what gives the currency its worth; the central bank signs with its own private key. Any retailer or payee with access to the central bank's public key can use it to confirm the validity of the signature, and hence of the CBDC (Fatima, 2018).

Users do not have to rely on the central bank or their financial institution to protect their private spending record, since the blinding is performed by the users themselves. Only the total amount of digital currency withdrawn and the total amount spent are known to the central bank. There is no way for commercial banks to know how much digital currency their customers have spent or where they have spent it. Secrecy is therefore not an issue in this system, because anonymity is cryptographically ensured.
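The blind-signature flow described above can be sketched with textbook RSA (Chaum's scheme, which the Chaum-style design mentioned here follows). The toy key sizes and the numeric "coin" below are illustrative only, not GNU Taler's actual key format or wire protocol.

```python
from math import gcd

# Textbook RSA parameters (toy-sized; real deployments use >= 2048-bit keys)
p, q = 61, 53
n = p * q                           # public modulus: 3233
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent

coin = 1234  # stands in for the coin's public key (hashed in real systems)

# 1. The user blinds the coin with a random factor r before sending it
r = 19
assert gcd(r, n) == 1
blinded = (coin * pow(r, e, n)) % n

# 2. The central bank signs the blinded value without seeing the coin
blind_sig = pow(blinded, d, n)

# 3. The user unblinds locally, obtaining a valid signature on the coin
sig = (blind_sig * pow(r, -1, n)) % n

# 4. Any merchant can verify the coin using only the bank's public key (e, n)
assert pow(sig, e, n) == coin
```

The key property is visible in step 3: the unblinded signature equals `coin ** d mod n`, exactly what the bank would have produced had it signed the coin directly, yet the bank never saw the coin's value.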

Solution 2 - Non-DLT Based Approach

Most central banks are experimenting with distributed ledger technology (DLT). In the absence of a centralised authority, a blockchain or DLT may be an attractive design option; however, for a retail CBDC issued by a reputable central bank it is not necessary. Distributing the central bank's registry only raises transaction costs; there are no practical benefits to the practice.

Improved scalability is a major advantage of not using DLT. The suggested technology would be as scalable and cost-effective as the modern RTGS platforms used by central banks today. GNU Taler can handle as many as 100,000 transactions per second. Secure storage of about 1-10 kilobytes per transaction is the most expensive part of the platform's cost structure. According to studies with an earlier prototype, GNU Taler's memory, connectivity, and computing costs at scale will be less than 0.0001 USD per transaction.
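A back-of-envelope check of the storage side of this cost structure, using the figures quoted above (100,000 transactions per second, up to 10 KB per transaction); the assumption of sustained peak load for a full day is mine and deliberately pessimistic.

```python
# Figures quoted in the text: up to 100,000 transactions per second,
# 1-10 KB of secure storage per transaction.
tps = 100_000
bytes_per_tx = 10 * 1024       # upper bound of the quoted 1-10 KB range
seconds_per_day = 86_400

daily_storage_gib = tps * bytes_per_tx * seconds_per_day / 2**30
print(f"Worst-case secure storage: {daily_storage_gib:,.0f} GiB/day")
```

Even at this pessimistic bound the figure is on the order of tens of terabytes per day, which is consistent with storage, rather than computation, dominating the platform's cost structure.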

Furthermore, since DLT is an account-based system, establishing anonymity is a problem: the only distinction from a standard account-based system is that a decentralised append-only database, rather than a central database, stores the accounts. Zero-knowledge proofs (ZKPs) and other privacy-enhancing cryptographic methods are viable but computationally intensive in a DLT setting, making their deployment on mobile devices impracticable. This does not apply to GNU Taler's Chaum-style signature verification, which is fast and reliable (Gupta et al., 2019).

Solution 3 - Regulatory Design

Under the proposed system, central banks would not be privy to the names or financial transactions of customers or merchants. Only when digital currency is withdrawn or redeemed are the central banks able to track it (Kaponda, 2018). Commercial banks could, if necessary, restrict the amount of CBDC a particular retailer can receive per transaction. Whereas the buyer's identity remains anonymous, the seller's operations and contractual responsibilities are disclosed upon inquiry by the appropriate authorities (Kaponda, 2018). If they identify odd patterns of merchant income, the financial institution, tax authorities, and law enforcement can request and review the commercial contracts underpinning the payments to establish whether the suspected behaviour is criminal. As mandated by Europe's General Data Protection Regulation (GDPR), the system uses privacy-by-design and privacy-by-default techniques. Neither merchants nor banks have intrinsic knowledge of the identities of their clients, and central banks remain unaware of the actions of the population (Kirkby, 2018).

Disintermediation of the banking system is a common concern with retail CBDCs. While it would be a severe issue with account-based CBDCs, token-based CBDCs ought to face less of an impediment (Oni, 2021). Comparable to hoarding cash, a token-based CBDC would carry the same danger of theft or loss. Central banks could implement withdrawal limits and negative interest if hoarding, or large transfers of money from bank accounts into CBDC, became an issue (Kadyrov & Prokhorov, 2018).

Central banks, businesses, and individuals could all profit from the proposed architecture. Because of its cost savings, this system is expected to be the first to handle long-envisioned micropayments digitally. Smart contracts would also become possible if digital currency were used to ratify electronic contracts (Sapkota & Grobys, 2019).

Using a newly developed plugin for GNU Taler, parents or guardians can restrict the money they give their wards to certain digital purchases while still maintaining anonymity: to keep the name and exact age hidden, merchants would simply learn that the consumer is of legal age to purchase the items being sold. Central banks could use mechanisms like this to create programmable currency.



MITS5505 Knowledge Management Report Sample


This assessment item relates to the unit learning outcomes as in the unit descriptor. This assessment is designed to assess the knowledge of implementation of a knowledge management solution utilizing emerging tools for stages of knowledge management.

This assessment covers the following LOs

LO3: Conduct research on emerging tools and techniques for the stages of knowledge creation, acquisition, transfer and management of knowledge and recommend the most appropriate choice based on expert judgement on the practical needs.

LO4: Apply and integrate appropriate KM components to develop effective knowledge management solutions.

LO5: Independently design a usable knowledge management strategy by application of key elements of a good knowledge management framework and by incorporating industry best practices and state of the art tools such as OpenKM or other emerging technologies.


These instructions apply to Major Assignment only.

Answer the following question based on a case study given overleaf

Give your views on implementation of knowledge management based on five distinct stages of knowledge management:

Stage 1: Advocate and learn
Stage 2: Develop strategy
Stage 3: Design and launch KM initiatives
Stage 4: Expand and support initiatives
Stage 5: Institutionalize knowledge management

Case study: A leading bank of Australia

You have been appointed as Chief Knowledge Officer in one of the leading investment firms of Australia to deliver a project: developing a knowledge base guide for the customer service staff to provide better services to the firm's customers. Your task is to implement a Knowledge Management system, considering tools and techniques and using KM components, that can help the firm provide better services to its customers in a very efficient manner.


Knowledge Management System

Knowledge Management refers to the internal process of creating and sharing a company's information and knowledge. The primary goal is to improve efficiency and retain key information within the company (Khan, 2021). As Chief Knowledge Officer (CKO), I have to control and manage the information resources of the company and ensure the effective use of its knowledge resources. Knowledge Management (KM) is implemented in a series of stages, described below. There are three types of knowledge:

• Explicit knowledge - Knowledge that can easily be captured in written, structured form (documents). It includes raw data, information, charts, etc., and can be used in any job, institutional work, or official work presented to an audience.

• Implicit knowledge - The second step of knowledge, after explicit knowledge. If we have built an airplane, the implicit next step is knowing how to fly that airplane. This type of knowledge is absent from a formal knowledge base.

• Tacit knowledge - Comprehensive knowledge that is hard to articulate and difficult to explain straightforwardly. It is informal, learned over time through experience, and applied to particular situations (Khan, 2021).

Benefits of Knowledge Management include:

• Improvement- It helps improve the quality of service for users.
• Satisfaction- It helps meet customer satisfaction levels.
• Self-service- It creates awareness around self-service adoption.
• Reduction- It reduces time wasted in training, costs, etc.
• Adoption- KM helps new services gain a faster response.
• Business Demands- Increased responsiveness to users' changing demands (Garfield, 2019).

The implementation of a real knowledge management system in the leading bank of Australia uses the five implementation stages given below:

Stage 1: Advocate and Learn

Advocacy is the first step: assessing and defining knowledge management, introducing it to the people of the leading bank of Australia, and creating a small founding group of knowledge management advocates. Bank staff need opportunities to get to know KM through practice, and everyone must be helped to recognise how knowledge management can be aligned with the company's other recent activities. To make knowledge management interesting to a broader crowd, it is necessary to use plain language when examining opportunities, genuine problems, and the potential value that knowledge management offers. A main cause of failure at the leading bank of Australia has been the haste with which it adopted financial and political resources in its planning, without due care. A bank invites failure at the very phase when it gets its workers to store knowledge without transmitting and disseminating the data; motivating every worker to transfer their unstructured knowledge may yield useless material for the bank. To get the support of staff, the knowledge management team has to elaborate its aims for this specific project so that everyone can see themselves in it. The KM team needs to introduce the problems and show how the KM plan could help achieve team or individual aims, along with the benefits of the KM system. The techniques and tools that support the KM plan may differ; generally, tools fall into several categories, such as knowledge repositories, expertise access tools, search-enabled tools, and discussion and chat tools that support data mining (Semertzaki & Bultrini, 2017).

Stage-2: Develop Strategy

This is the second stage of KM implementation in an organisation. The KM system requires an approach that is consistent with the aims and objectives of the organisation. It is also necessary to create a KM team that can commit itself fully to the implementation, work on the approach, and put it into action. Moreover, the bank of Australia needs to recognise the assets this strategy will utilise. The strategy is initially built around practices that each member of the KM team will have to execute, supported by investment. A pilot KM initiative then needs to be drawn from the bank's environment (Robertson, 2021), and the business needs formed to install the KM strategy. The pilot project can be chosen from areas of the bank such as:

• A bank area that is not developing because it lacks knowledge linked to its field; KM can help get this field moving forward within the bank.

• If a new business plan has been introduced to the bank, KM will need to be installed so that the bank's workers can learn new skills linked to KM and the way to execute jobs under that plan (Simmons & Davis, 2019).

The essential resources for the pilot project are human resources, a budget, and a project plan that will support its employees and the processes related to KM (Ivanov, 2017).

Stage-3: Design and launch KM initiative

By this stage, the project task force has been created, the pilot has been organised, and the monetary resources and workers have been assigned for the implementation of the project. This stage launches the pilot and collects initial results. Using adequate investment for the whole implementation, it is necessary at this phase to create methodologies that can be deployed and replicated, and measurements to capture and share the information from lessons learned. This stage performs the initialisation, which requires data on specific indicators (Khan, 2021). KM also benefits from using, sharing, and capturing data and knowledge in a definite form. In the initial phase, we need to release funds for the pilot and allocate a KM group, such as a cross-unit team. The next phase is to create methodologies that can be replicated in building knowledge collections. The last phase of this stage is to capture and record the lessons learned (Disha, 2019). The budget for the pilot project covers staff, man-hours, physical and non-physical resources, and supplies; overall, it will be approximately $100,000. Once the pilot deployment is under way and its outcomes have been evaluated and assessed, the knowledge management plan will take one of the paths below:

• Current initiatives maintain the status quo.
• KM efforts go ahead, collecting new initiatives.

For any KM initiative to succeed, you need to know your people very well and be able to make them understand what needs to change or be upgraded. Reducing duplication of work increases productivity, and tracking customer behaviour enhances customer service.

Stage-4: Expand and Support

At this stage, the pilot project has been implemented and results collected, with important lessons learned and captured. This stage contains the support and expansion actions for company-wide KM initiatives. Its main objective is to develop and market an expansion approach across the whole bank and to handle the growth of the system efficiently. The first phase is to create the expansion strategy, for which there are two approaches: apply the criteria used to choose the pilot to functions in other areas of the bank, or apply all approaches across the whole knowledge management system. A corporate KM team, practice leaders, a knowledge officer, and a core facilitator team can handle the system. The second phase is to communicate and market the KM strategies, which can be accomplished by broadly disseminating information about KM initiatives around the bank of Australia. New employee orientation needs to incorporate knowledge management system training, and coordinators and training leaders are needed to teach newly hired staff about the KM system so they become familiar with it quickly. The last activity is to manage the KM system's growth by managing the expansion of KM initiatives occurring at this stage (Babu, 2018).

Stage-5: Institutionalize Knowledge Management

This is the last step of implementing a knowledge management system in an organisation. It involves embedding the knowledge management system as an essential part of the processes of the leading bank of Australia. At this stage, the bank needs to re-specify the approach to the task, revisit assignments, and review the arrangement of its performance; it also needs to recognise indicators (Disha, 2019). If one of the conditions below holds, the knowledge management system is ready to move to the last stage of implementation:

• Each member of staff is trained to use the knowledge management tools and techniques.

• Knowledge management initiatives are organised.

• The KM system is linked directly to the business model.

The organisation takes several actions to implement KM effectively:

• The first action is to embed KM within the business model. This requires executive and CEO support, and the budget and organisational responsibilities must be identified to support the implementation of KM as a strategy.

• The second action is to analyse and monitor the wellbeing of the KM system regularly. To ensure the KM system stays healthy, it is necessary to take the pulse of the KM initiative at constant intervals.

• The third action is to align performance evaluation and the reward system with the KM approach. Moreover, the KM framework must be maintained within the leading bank of Australia alongside local control: the bank needs to let individual groups in various areas shape KM resources to accomplish their local needs.

• The next action is to carry on the KM journey. As the bank becomes a true knowledge-sharing company, the demand for knowledge acquisition will grow.

• The last action is for the bank to identify the success factors that keep the KM spirit alive, including:

- Availability of a motivating and consistent vision
- Capability to maintain full leadership support (Eisenhauer, 2020).



MIS605 Systems Analysis and Design Report Sample

Task Summary

In response to the case study provided, identify the functional and non-functional requirements for the required information system, and then build a Use Case Diagram and document a set of use cases.


System analysis methods and skills are of fundamental importance for a Business Analyst. This assessment allows you to enhance your system analysis skills by capturing the business requirements and then the functional and non-functional requirements of a system. It helps you identify "what" the proposed system will do and "how".


1. Please read the attached MIS605_ Assessment 1_Case Study. Note that every piece of information provided in the following case serves a purpose.

2. Once you have completed reading the case study, please answer the following questions:

Question 1

Identify all the human and non-human actors within the system. Provide a brief description of every actor.

Question 2

Using the information provided in the case study, build a Use Case Diagram using any diagramming software.

Note: Please make assumptions where needed.

Question 3

Document all use cases (use case methods). All use cases identified in the Use Case Diagram in Question 2 must be elaborated in detail.



Nowadays, almost one out of three individuals has developed a habit of reading books, and readers increasingly read on laptops, phones, and other devices. To serve them, a young technopreneur has developed a website named ‘bookedbook.com’ that provides many interesting features for its users (Tiwari & Gupta, 2015). Starting with registration, users can launch their books online, readers can read books of their choice and join live sessions or events, and authors can fill in book show request forms to advertise their books online, going live to advertise up to five books in one session. The website is therefore intended as an all-in-one platform, and a mobile application will also be available to users.
This study describes the system through a Use Case Diagram with all the necessary «include» and «extend» relationships.


Human actors

Non- Human Actors

Hardware Requirements

The laptop or computer with:

• 2 GB RAM (preferred)

• Internet access of at least 60 kbps

• Minimum screen resolution of 1024×768

• 4 GB of free hard disk space

• Internet Explorer 8.0+, Firefox 4.0+, or Safari 3.0+; the browser must be Java-enabled

• Operating system: Windows 8 or Vista

The server will be connected directly to the client's system; the client will then access the server's database.


• Front end: the user and content manager software is built using HTML and JSP, and the content manager interface is built using Java (El-Attar, 2019).

• Server: Apache Tomcat application server (Oracle Corporation).

• Back end: Database such as Oracle 11g


A use case diagram graphically represents all the interactions among the elements of the bookedbook.com website. It is one of the methods used in system analysis for identifying and organizing the website's requirements (Zhai & Lu, 2017). The main actors of the bookedbook.com website are system users, book owners, authors, and content managers (Iqbal et al., 2020). They participate in several use cases, such as registration, book launch, launch report creation, request management, book management, book event management, ad selection, and book advertisement.

Figure 1 Use Case Diagram

Answer 3. Use Cases

System User Registration

Subscription of readers

Launching of books

Exchange of books

Live Meetup

Advertisements of book


Nowadays, book readers are increasingly adopting the path of reading books online. System users can take out a subscription for the time duration of their choice and can end it at any time. Using the website, users can exchange book reviews or comment on a book. The launching of books is another feature that will attract more and more readers to the website, and authors can advertise their books on the platform. The website should therefore bring a major transformation for all readers.
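For readers who want to see how documented use cases can be kept machine-checkable, the sketch below models a few of the actors and use cases above as plain Python data classes. The «include»/«extend» relationships and the exact actor names are illustrative assumptions, not details taken from the case study.

```python
from dataclasses import dataclass, field

@dataclass
class UseCase:
    name: str
    actors: list                                   # actors that participate
    includes: list = field(default_factory=list)   # always-invoked sub-cases
    extends: list = field(default_factory=list)    # optional extensions

# Hypothetical model of a few bookedbook.com use cases
registration = UseCase("System User Registration", ["System User"])
subscription = UseCase("Subscription of readers", ["Reader"],
                       includes=["System User Registration"])
launch = UseCase("Launching of books", ["Author", "Content Manager"])
advert = UseCase("Advertisements of book", ["Author"],
                 extends=["Live Meetup"])

# A simple check: which use cases does the Author actor take part in?
author_cases = [uc.name
                for uc in (registration, subscription, launch, advert)
                if "Author" in uc.actors]
print(author_cases)  # ['Launching of books', 'Advertisements of book']
```

Keeping the model in code like this makes it easy to verify that every use case named in the diagram has at least one actor and that «include» targets actually exist.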



Read More

MIS610 Advanced Professional Practice Report Sample

Task Summary

You are required to write a 1500-word professional practice reflection. This requires you to reflect on your experiences in the Master of Business Information Systems (MBIS), specifically the IS Capstone, and on how these experiences provided you with insights into your own future as an IT/IS professional.


You have been studying Information Systems through a lens of Business, as most of you will choose careers in IS management and analytics. Technology is a strategic part of any enterprise today, large or small. Throughout your degree, you have been developing the understanding and skills that enable you to work at a high level in these environments. You are soon to graduate and follow your career path. This assessment gives you an opportunity to reflect on how far you have come and where you wish to go.

Task Instructions

Write a 1500 words professional practice reflection addressing the following questions:

• Complete the skills audit to identify your level of skills and knowledge compared with the nationally recognised ICT skills set needed for a career in ICT.

• Identify your two strongest skills and reflect on why those are your strongest. Link this to your experiences in the MBIS and your life in general (for example you might also be influenced by family, work experience, previous study, hobbies, volunteer work etc.)

• Identify the two skills that need the most work. As above, link this discussion to your experiences in the MBIS and your life in general.

• Now that you are about to graduate, how do these skills match the job/career that you want?

Find a job advertisement and match the skills and knowledge wanted in the advertisement against your audit. What do you need to work on?

• Finally, you have identified your skills levels and how it relates to your career, now choose three skills and set yourself SMART* goals for the next six months to achieve your career objectives.



From a traditional viewpoint, the job of IT professionals is linked with developing software to solve complex problems on a computer. Information systems have evolved over the years with the development of technology, the Internet, big data, and more. These emerging technologies have opened many career opportunities, including Chief Information Officer, Chief Technology Officer, cybersecurity specialist, and business analyst. Rapid change is one of the hallmarks of the ICT domain and is prominent at both the micro and macro levels. Disruptive technologies like IoT and Big Data have driven many changes at the macro level, and job opportunities have increased over the last few decades. Technical information systems allow students like us to work on the Internet efficiently while connecting globally (Intersoft consulting, 2016). Knowledge of various developed software will help in gaining information across varied domains, thus enhancing professional knowledge. This report aims to evaluate my interests and knowledge in relation to ICT career opportunities by conducting a skills audit to recognize strengths and weaknesses that can be improved for substantial career growth. This will strengthen skills and help students like us grab good career opportunities.

Reflection report:

Completion of the audit skills:

Since the ICT domain witnesses rapid changes, skills audits help in recognizing the weaknesses and strengths that will underpin a promising career in ICT domains. I have a deep interest in networks and computer systems, particularly in protecting computer systems from unauthorized access and cybercrime. It has become important to possess advanced skills in ICT domains to protect all types of electronic devices from breaches. Thus, I have acquired additional knowledge by attending webinars, participating in class activities and studying the module lectures. I believe my knowledge of cyber-threat activities has strengthened, which has benefited me in safeguarding my own electronic devices, and I am keen to apply and broaden my knowledge base to protect the ICT infrastructure of organizations. According to the skills audit I completed, my knowledge of IT management, data management and IT infrastructure is strongest. However, to keep up with rapid changes in the domain and in HR strategies, I need to polish my skills to remain competitive in the job market (Australian Computer Society, 2014).

Identification of the two strongest skills:

Proper identification of one's own strengths is necessary for achieving success in personal and professional life. Any person can enhance their scope and range of opportunities by adding value to their strongest skills (Tufféry, 2011). As I have been studying Information Systems specifically through a business perspective, I have mastered a wide variety of disciplines. Among them, I consider data and information management, along with software development, to be my strongest competencies. These two skills have been helping me complete my MBIS degree.

Data and information management is concerned with the proper management, quality and storage of both structured and unstructured information. It also covers integrity and security in support of digital services in business management (Hall & Wilcox, 2019). Besides, it helps business units strategize plans and policies for future investment. I have also mastered web development and software development, as this skill has been my hobby for some time. I have done several freelancing jobs in web and software development, so I can affirm that the aforementioned skills are the main pillars of my future professional prospects.

Identification of the skills required for improvement purposes:

In the present digitalized era, flexibility is very important for achieving a successful professional life; without it, the scope and opportunities of a professional career tend to narrow after a certain period (Hunt, 2018). Flexibility also supports adapting to varied job roles and technologies. Currently, I lack skills in IT infrastructure and network support, due to which I faced many issues while completing assessments and understanding activities during my MBIS course. I also struggle with the proper management and regulation of hardware-related aspects, including problems regarding changed services. I have several examples of failed support services to customers from my volunteer work as a software technician in a small business organization, where I was unable to resolve many problems, which dissatisfied the customers. Network support is another important skill set that I need to enhance.

The utilization of the skills towards professional career:

As per my skills audit and the strengths identified during my MBIS degree, web developer or web designer would be the best-suited job for me. Hence, I searched for companies specifically advertising for a web developer or web designer. I have opted for ENTTEC, as they are currently facing a high shortage of web developers. The company operates through its offices at Keysborough VIC, Australia. The company has advertised that the web developer will be required to update and create innovative software and related components such as websites and software applications. The IT products and services offered by the company act as its revenue source, so their management and regulation are of utmost importance. However, the company strongly recommends soft skills such as teamwork, communication and crisis management, which I need to polish.

I have also undergone other job searches that demand similar skills but require more work experience. Hence, strengthening my skills will help me obtain internships that will increase my professional expertise and further enable me to work in an esteemed organization in my desired professional career.

SMART goals for the next six months:

Although web development, along with data and information management, are my strongest skills, I need to polish my competencies in infrastructure.

In addition, I would like to improve my communication, teamwork and interpersonal skills in order to thrive in highly competitive ICT organizations. I also realize that strengthening ICT competencies is important, as advanced technology is being rapidly adopted by all organizations; this provides good opportunities with high job satisfaction, since my work will help many companies progress efficiently (Mainga, 2017).


The report has helped me monitor my strengths and weaknesses in this domain, which supports my career development. Through the audit, I learned that I must improve my skills in IT management, data and administration, for which I have designed SMART goals for six months. In addition, I recognized that IT domains undergo rapid change, which makes possessing interpersonal skills important. Business environments today are highly competitive and require a balance of both personal and professional skills. Students like us, who will be stepping into the professional world, need to obtain information on HR strategies and practices; job adverts are thus helpful tools in polishing skills for a higher chance of being recruited. The skills audit has also shown me that assessing strengths and weaknesses requires continuous improvement, so I will continue this type of reflection in my professional career. My skills should be strengthened so that they can be used in job-oriented processes in the future, while I also explore my job role and related skills to reap benefits in different developmental domains of ICT.



Read More

MIS607 Cybersecurity Report Sample

Task Summary

You are required to write a 1500-word threat modelling report in response to a case scenario by identifying the threat types and key factors involved. This assessment is intended to build your fundamental understanding of these key threats so that you will be able to respond to/mitigate those factors in Assessment 3. In doing so, this assessment will formatively develop the knowledge required for you to complete Assessment 3 successfully.


Security threat modelling, or threat modelling is a process of assessing and documenting a system's security risks. Threat modelling is a repeatable process that helps you find and mitigate all of the threats to your products/services. It contributes to the risk management process because threats to software and infrastructure are risks to the user and environment deploying the software. As a professional, your role will require you to understand the most at-risk components and create awareness among the staff of such high-risk components and how to manage them. Having a working understanding of these concepts will enable you to uncover threats to the system before the system is committed to code.

Task Instructions

1. Carefully read the attached case scenario to understand the concepts being discussed in the case.

2. Review your subject notes to establish the relevant area of investigation that applies to the case. Re-read any relevant readings that have been recommended in the case area in modules. Plan how you will structure your ideas for the threat model report.

3. Draw DFDs (Data Flow Diagrams):

• Include processes, data stores, data flows
• Include trust boundaries (Add trust boundaries that intersect data flows)
• Iterate over processes, data stores, and see where they need to be broken down
• Enumerate assumptions, dependencies
• Number everything (if manual)
• Determine the threat types that might impact your system
• STRIDE/Element: Identifying threats to the system.
• Understanding the threats (threat, property, definition)
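The steps above can be sketched in code. The minimal Python model below represents processes and data stores as elements assigned to trust zones and flags every data flow whose endpoints sit in different zones, i.e. the flows that cross a trust boundary. The element names and zones are hypothetical illustrations, not taken from any case.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Element:
    name: str
    zone: str  # trust zone the element lives in

@dataclass(frozen=True)
class DataFlow:
    source: Element
    sink: Element
    label: str

    def crosses_trust_boundary(self) -> bool:
        # A flow whose endpoints sit in different trust zones crosses
        # a trust boundary and deserves extra scrutiny in the model.
        return self.source.zone != self.sink.zone

# Hypothetical DFD elements for an insurance-style system
client   = Element("Client browser",   zone="internet")
web_app  = Element("Web application",  zone="dmz")
database = Element("Client database",  zone="internal")

flows = [
    DataFlow(client, web_app, "login request"),
    DataFlow(web_app, database, "client record query"),
]

risky = [f.label for f in flows if f.crosses_trust_boundary()]
print(risky)  # ['login request', 'client record query']
```

Enumerating boundary-crossing flows this way gives you the list of places where STRIDE threats should be considered first.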

4. The report should consist of the following structure:

A title page with subject code and name, assignment title, student’s name, student number, and lecturer’s name. The introduction that will also serve as your statement of purpose for the report. This means that you will tell the reader what you are going to cover in your report. You will need to inform the reader of:

a) Your area of research and its context

b) The key concepts of cybersecurity you will be addressing and why you are drawing the threat model

c) What the reader can expect to find in the body of the report

The body of the report will need to respond to the specific requirements of the case study. It is advised that you use the case study to assist you in structuring the threat model report, drawing the DFD and presenting the diagram by means of subheadings in the body of the report.

The conclusion will summarise any findings or recommendations that the report puts forward regarding the concepts covered in the report.

5. Format of the report

The report should use font Arial or Calibri 11 point, be line spaced at 1.5 for ease of reading, and have page numbers on the bottom of each page. If diagrams or tables are used, due attention should be given to pagination to avoid loss of meaning and continuity by unnecessarily splitting information over two pages. Diagrams must carry the appropriate captioning.

6. Referencing

There are requirements for referencing this report using APA style for citing and referencing research. It is expected that you use 10 external references in the relevant subject area based on readings and further research. Please see more information on referencing here:

7. You are strongly advised to read the rubric, which is an evaluation guide with criteria for grading the assignment. This will give you a clear picture of what a successful report looks like.


1. Introduction

The report develops a threat model to address the cyber risks at the Business & Communication (B&C) Insurance company. Cybersecurity management is essential for risk identification, analysis, and mitigation (Mulligan & Schneider, 2011), and it plays a crucial role in building cyber resilience by minimizing threats (Ferdinand, 2015). B&C Insurance is under threat of information theft: a ransom email from an unknown source was sent to the company's CEO, in which the hackers claimed to hold the details of 200,000 of the company's clients, attaching a sample of 200 clients as proof. The report will identify the risk factors and "at-risk" elements and develop a threat model using the STRIDE framework to mitigate the risks associated with this attack. To identify the potential risks and their impacts and to suggest proper mitigations, the threat model will be developed and a DFD will be drawn to explore the risk factors and mitigation strategy for the B&C Insurance case study.

2. Response to the specific requirements

2.1. Types of threat and major factors involved

The B&C Insurance company can be under threat of various types of cyberattacks. Different types of threats increase the potential for information risk, which is where cybersecurity management is required (Reuvid, 2018). As B&C Insurance is a private firm, the possibility of malware attacks is high. Forensic computer specialists investigated the ransom email from the unauthorized source and confirmed that the sample of 200 client records is genuine. Therefore, the risk lies in the information of the company's 200,000 clients, which was hacked by an unknown source, and the type of attack is ransomware. Some of the most potent threats that businesses face are ransomware, malware, DDoS attacks and others (Russell, 2017). As the hacker used a ransom email, it is possible that the threat lies in a malware attack.

The network, the system, and the user are the three factors that are at the highest risk. Within B&C Insurance, an insecure network creates the risk that confidential information can be exfiltrated by an unknown source. The security of user information lies in the secrecy of the authentication process (Antonucci, 2017). Employees within the company can unknowingly share confidential data while granting access to any source, and a similar incident can happen with the company's customers. The system is also vulnerable, since data integrity is required for system management.

Other possible attacks are phishing and spoofing, where attackers target the company's employees. Fraudulent tricks can be used to obtain access to information from the employees. Clients can also be tricked into believing that access is being granted by an authorized source.

2.2. Threat Modeling using STRIDE framework

A threat modeling framework helps manage cybersecurity by analyzing risks and their impact and proposing a mitigation strategy to tackle them (Xiong & Lagerström, 2019). Applying the STRIDE framework in the threat modeling process specifies the threats and preserves the integrity, confidentiality, and availability of information. The STRIDE framework will help ensure the security of information in B&C Insurance by providing a strategy for threat detection, evaluation, and mitigation. The six threat categories of the STRIDE model will be applied to address the cyber risks within B&C Insurance.


Table 1: STRIDE
Source: (Developed by the author)
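As a rough companion to Table 1, the snippet below encodes the standard STRIDE category-to-property mapping and applies it to two illustrative threats from the case. The classification of each threat is an assumption made for illustration, not a finding from the case study.

```python
# STRIDE maps each threat category to the security property it violates.
STRIDE = {
    "Spoofing":               "Authentication",
    "Tampering":              "Integrity",
    "Repudiation":            "Non-repudiation",
    "Information disclosure": "Confidentiality",
    "Denial of service":      "Availability",
    "Elevation of privilege": "Authorization",
}

# Hypothetical classification of threats from the B&C case
case_threats = {
    "Ransom email claiming stolen client records": "Information disclosure",
    "Phishing emails targeting employees":         "Spoofing",
}

for threat, category in case_threats.items():
    print(f"{threat}: {category} -> violates {STRIDE[category]}")
```

Walking each DFD element through this table is the "STRIDE-per-element" step named in the task instructions.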

2.3. Other Models and Frameworks of Threat modeling

Other suitable models may also help the company manage the risks in its information system.
The DREAD framework is capable of deriving a threat intelligence solution: it applies a rating system for risk assessment, analysis, and the development of risk probabilities (Omotosho, Ayemlo Haruna, & Mikail Olaniyi, 2019). Through the information collection process, the DREAD framework rates the potential risks from low to medium to high, allowing users to identify threats and develop a proper mitigation plan. B&C Insurance can use the DREAD model for risk identification, analysis, and the development of a rating system.
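A minimal sketch of the DREAD rating idea is shown below, assuming a 1–3 score per factor and simple banding thresholds; both the thresholds and the example scores are illustrative choices, not values prescribed by the framework or the case.

```python
# DREAD rates a threat on five factors; the mean score places it
# in a low/medium/high band.
DREAD_FACTORS = ("damage", "reproducibility", "exploitability",
                 "affected_users", "discoverability")

def dread_rating(scores: dict) -> str:
    """Average the five DREAD factor scores (1-3 scale assumed here)."""
    avg = sum(scores[f] for f in DREAD_FACTORS) / len(DREAD_FACTORS)
    if avg >= 2.5:
        return "high"
    if avg >= 1.5:
        return "medium"
    return "low"

# Hypothetical scoring of the data-theft threat from the case study
theft = {"damage": 3, "reproducibility": 2, "exploitability": 2,
         "affected_users": 3, "discoverability": 2}
print(dread_rating(theft))  # medium (average 2.4)
```

The value of the exercise is less the final label than forcing each factor to be scored and justified explicitly.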

The NIST model of security management helps set specific guidelines for managing risks through threat detection and response to cyber-attacks, generating a strategy for risk prevention. The NIST cybersecurity framework can be implemented in B&C Insurance to identify the type of threat and then develop a risk mitigation strategy. The framework can help the organization manage cyber threats by setting proper guidelines for cybersecurity management.

2.4. Data Flow Diagram

At-risk components

The health insurance company B&C holds records of its clients' health information. Other client information may include personal details, demographic information, financial information, and family information. Risk arises wherever hackers can steal this confidential client information for misuse; cyber risks increase the vulnerability of the information system (Stephens, 2020). The employees within the organization are also at risk of cyber hacking: their basic details, salary status, and family background are prone to high risk. The information system within B&C Insurance is a valuable asset that is under cyber threat. Moreover, risk can also occur in the networking system, where information can be intercepted by an unknown source. Therefore, it is essential to safeguard the at-risk components in the organization.


Figure 1: Context diagram

Figure 2: Level-1 Diagram

2.5. Security Strategies

The B&C Insurance company needs to safeguard its information and system from cyber attacks. For managing information security, the company needs to take the following actions.

· Data encryption will help control user access, and biometrics or access control lists can also be effective.

· Antivirus, network security control tools, anti-malware, and anti-phishing tools can be implemented to manage the proper security of the system. Installing an automated security tool can also be helpful.

· Access control and user authentication through proper password development is also an effective technique for managing information security (Stallings, & Brown, 2018).

· Security control measures like proxy firewalls can help in managing the security of systems (Durieux, Hamadi, &Monperrus, 2020).

· Training of the staff regarding security management is required to reduce the risk of phishing and spoofing.
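The access control and user authentication point above can be illustrated with a small sketch that uses only the Python standard library. The PBKDF2 parameters here (16-byte salt, 100,000 iterations) are reasonable illustrative choices, not values mandated by the report or the case study.

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a salted hash with PBKDF2-HMAC-SHA256 (stdlib only)."""
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, digest):
    """Recompute the hash and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    # hmac.compare_digest avoids leaking information through timing.
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("S3cret-passphrase")
print(verify_password("S3cret-passphrase", salt, digest))  # True
print(verify_password("wrong-guess", salt, digest))        # False
```

Storing only the salt and digest, never the plaintext password, is the property that limits the damage if the client database itself is stolen, as in the B&C scenario.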

3. Conclusion

Cybersecurity management can be achieved by developing a threat model. The STRIDE framework will help B&C Insurance effectively manage its information system; implementing the model will also help to identify, analyze and mitigate potential risks. Identifying the at-risk components will help the company understand the underlying vulnerabilities in its information system. The identified risk factors contributed to drawing the DFD, where applying the STRIDE framework produced potential solutions for security risk management. Moreover, the alternative models and the security strategy will also help the company manage future risks in its information system.


Read More

MIS608 Agile Project Management Report Sample

Task Summary

You are required to write an individual research report of 1500 words to demonstrate your understanding of the origins and foundations of Agile by addressing the following areas:

1. The origins of Agile – why did Agile emerge, what was it in response to, and how did this lead to the values and principles as outlined in the agile manifesto?

2. The origins of Lean and how it has influenced Agile practice

3. The similarities and differences between Scrum and Kanban as work methods

4. Why adopting Agile benefits an organization.

Please refer to the Task Instructions for details on how to complete this task.

Task Instructions

1. Write a 1500 words research report to demonstrate your understanding of the origins and foundations of Agile by addressing the following areas:

• The origins of Agile – why did Agile emerge, what was it in response to, and how did this lead to the values and principles as outlined in the agile manifesto?

• The origins of Lean and how it has influenced Agile practice.

• The similarities and differences between Scrum and Kanban as work methods.

• Why adopting Agile benefits an organisation.

2. Review your subject notes to establish the relevant area of investigation that applies to the case. Perform additional research in the area of investigation and select FIVE (5) additional sources which will add value to your report in the relevant area of investigation.

3. Plan how you will structure your ideas for the report. Write a report plan before you start writing. The report should be 1500 words. Therefore, it does not require an executive summary nor an abstract.

4. The report should consist of the following structure:

A title page with subject code and name, assignment title, student’s name, student number, and lecturer’s name.

The introduction (100 – 150 words) that will also serve as your statement of purpose for the report—this means that you will tell the reader what you are going to cover in your report.

You will need to inform the reader of:

a. Your area of research and its context
b. The key concepts you will be addressing
c. What the reader can expect to find in the body of the report

The body of the report (1200-1300 words) will need to cover four specific areas:

a) Why did Agile originate? When did it emerge and what was it in response to? How did this lead to the four values and 12 principles outlined in the Agile Manifesto?

b) Where did Lean originate? Briefly define what Lean is and explain which two Lean philosophies have been adopted in the evolution of Agile practice.

c) Scrum and Kanban have many similarities, but also key differences. Compare and contrast Scrum and Kanban with each other, illustrating these similarities and differences with examples.

d) Explain what value adopting Agile can offer to an organisation.
The conclusion (100-150 words) will summarise any findings or recommendations that the report puts forward regarding the concepts covered in the report.

5. Format of the report

The report should use font Arial or Calibri 11 point, be line spaced at 1.5 for ease of reading, and have page numbers on the bottom of each page. If diagrams or tables are used, due attention should be given to pagination to avoid loss of meaning and continuity by unnecessarily splitting information over two pages. Diagrams must carry the appropriate captioning.

6. Referencing

There are requirements for referencing this report using APA referencing style. It is expected that you reference any lecture notes used and five additional sources in the relevant subject area based on readings and further research.



Business enterprises constantly face changes in the external environment, and adopting an agile approach to project development helps them respond to change effectively. Understanding concepts like Scrum, Agile, Lean and Kanban will be helpful for proceeding in this subject. In this paper, Agile, its origins and how the Agile Manifesto was formed are discussed in depth. The body of the report then discusses the Lean, Scrum and Kanban concepts used in software development. Such in-depth knowledge will help not just in software development but also in other workplace areas in the future.

Agile, its Origination & Events that led to the Agile Manifesto

The agile method originated in the software development industry well before the manifesto was written (Nadeem & Lee, 2019, p. 963). Most software development projects conducted before the Agile Manifesto took a very long time to finish, and both the industry and its consumers needed an innovative, new and effective approach. In the early 1990s the software development industry faced a crisis of application delays: the time taken to bring an application to market was huge, and thirty years ago one could wait years for a problem to be solved in software development, production, aerospace and defence. The waterfall method, in which all phases of a project's lifecycle are clearly defined, dominated at this time. As its name suggests, it is a process in which teams finish one step completely before starting the next (Chari & Agrawal, 2018, p. 165). When a stage in such a project was completed, it would often be frozen for a very long time. This method was not effective, and rarely did a software development project achieve the stability it assumed. The need for iterative development arose because teams needed to carry out activities, measure them, make changes as needed and improve again.

Out of frustration, many software developers began changing their development approach during the 1990s, and methods such as Scrum, DSDM and Rapid Application Development (RAD) emerged. In 2001, a group of software engineers met at a ski resort in Utah with the intention of solving problems such as delays in project delivery and bridging the disparities between expectations and delivered products. There was a pressing need to shorten software development time so that products reached end users faster. Two things were identified at this meeting: first, shortening delays would make products fit the market, and second, faster consumer feedback would let teams continuously improve their products. From this meeting the developers produced the Agile Manifesto, whose values and principles gave the software development industry new traction and power (Hohl et al., 2018, p. 1).

Origination of Lean Methodology

Historically, lean began well before the era of software development. It originated in Toyota's Japanese automobile factories, where, in the 1950s and 1960s, Taiichi Ohno developed the Toyota Production System (TPS) to reduce waste and establish a sustainable means of automobile production. Visual signals were used to produce inventory only as it was needed. This became known as just-in-time production, and it focused primarily on minimising waste and optimising every production process. Western manufacturers struggled to keep pace with the speed of Japanese manufacturing, and they soon began adopting lean manufacturing processes themselves (Rafique et al., 2019, p. 386). Lean's guiding principles were straightforward to implement, and major IT companies adopted them as a result.

Lean can be defined as a management approach that supports a continuous improvement model. It aims to organise human activity to deliver value and eliminate waste.

The concepts of agile and lean thinking share many similarities. Blending lean philosophies into agile produces innovative work processes. By combining the best of the two methodologies, businesses move faster, develop better-quality products, and build healthy, sustainable work environments. Two lean philosophies in particular are used in agile practice.

Build, measure, learn: The build-measure-learn principle of lean methodology is used in agile (Kane, 2020, p. 1561). Agile's iterative approach is based on this lean principle, which encourages testing, measurement, and validation based on past work and market trends. Lean always focuses on finding the path that offers maximum value to the customer.

Waste elimination: Agile adopts the lean philosophy of eliminating waste. Agile teams pull the highest-priority work, iterate on it, and deliver it progressively. They learn and improve continuously to ensure that nothing goes unused or wasted.

Scrum and Kanban: Points of Similarity

Kanban is a methodology for optimising and managing workflow: the entire process is visualised with tools such as the Kanban board, and teams work continuously. Scrum is another framework, one in which in-depth, prescriptive planning is important. Scrum and Kanban share many characteristics, and both popular methodologies are used in many organisations. The following similarities can be observed between them:

1. Both methodologies are lean as well as agile in their approach.
2. Both aim to limit work in progress.
3. Both use pull scheduling to move work faster.
4. Both break work down into smaller pieces.
5. Both focus on self-organising teams.
6. Both target software that is reusable in nature.
7. Both use transparency as a tool to drive continuous improvement (Lei, Ganjeizadeh, Jayachandran, et al., 2017, p. 59).

Scrum and Kanban: Points of Difference

Agile Advantages to the Enterprise

A large number of organizations are moving towards agile development because it offers an environment that evolves constantly.

Beat the competition: Consumers, regulators, and business partners all have pressing needs. Business stakeholders demand products and services that help them beat the competition (Mckew, 2018, p. 22). This requires fast-changing goals, quick restructuring, and team adaptability.

Integrate innovation: With an agile approach, organizations can encourage the use of new technologies that enhance their overall efficiency and performance (Potdar, Routroy & Behera, 2017, p. 2022).

Stakeholder engagement: Stakeholders collaborate before, during, and after each sprint. Releasing working software to the client at regular intervals brings the whole team together behind a shared goal, and such teams show high involvement in the enterprise in general.

Forecast delivery better: In agile, visible project progress is significant. Companies sometimes even make beta releases of the software, increasing the overall value of the business. Agile gives the team the ability to make accurate delivery predictions, which satisfies the customer.

Element of transparency: Agile gives the organization a golden opportunity to work with the consumer during the development phase. The customer is aware of the features being developed and gives feedback (Kalenda, Hyna & Rossi, 2018, p. 30). All parties engaged in agile development enjoy a high level of awareness and transparency.

Change opportunities: Agile's iterative approach provides ample scope for making changes. Resistance from the workforce is minimal because they are already accustomed to change.


The world is going through a major digital shift. Businesses in every industry are integrating new technologies and processes, and staying at the forefront of a changing environment is important for survival. The concepts showcased in this paper about agile and the use of its values and principles are valuable for businesses. Agile is recognised as the most suitable methodology for projects, product development, and workforce management. Through agile, managers can detect problems, find solutions, and implement them quickly. Such dynamic thinking, in which solutions are given priority, will no doubt help enterprises achieve sustainable success.



DATA4000 Introduction to Business Analytics Report 3 Sample

Your Task

Consider the information below regarding the Capital One data breach. Read the case study carefully and, using the resources listed together with your own research, complete:

• Part A (Industry Report) individually by Monday 23:55 AEDT Week 12
Assessment Description

Capital One


Who is Capital One?

Capital One Financial Corporation is an American bank holding company specializing in credit cards, auto loans, banking, and savings accounts. The bank has 755 branches including 30 café style locations and 2,000 ATMs. It is ranked 97th on the Fortune 500, 17th on Fortune's 100 Best Companies to Work For list, and conducts business in the United States, Canada, and the United Kingdom. The company helped pioneer the mass marketing of credit cards in the 1990s. In 2016, it was the 5th largest credit card issuer by purchase volume, after American Express, JPMorgan Chase, Bank of America, and Citigroup. The company's three divisions are credit cards, consumer banking and commercial banking. In the fourth quarter of 2018, 75% of the company's revenues were from credit cards, 14% were from consumer banking, and 11% were from commercial banking.


Capital One is the fifth largest consumer bank in the U.S. and eighth largest bank overall (Capital One, 2020), with approximately 50 thousand employees and 28 billion US dollars in revenue in 2018 (Capital One, 2019). Capital One works in a highly regulated industry, and the company abides by existing regulations, as stated by them: “The Director Independence Standards are intended to comply with the New York Stock Exchange (“NYSE”) corporate governance rules, the Sarbanes-Oxley Act of 2002, the Dodd-Frank Wall Street Reform and Consumer Protection Act of 2010, and the implementing rules of the Securities and Exchange Commission (SEC) thereunder (or any other legal or regulatory requirements, as applicable)” (Capital One, 2019). In addition, Capital One is a member of the Financial Services Sector Coordinating Council (FSSCC), the organization responsible for proposing improvements in the Cybersecurity framework. Capital One is an organization that values the use of technology, and it is a leading U.S. bank in terms of early adoption of cloud computing technologies. According to its 2018 annual investor report (Capital One, 2019), Capital One considers that “We’re Building a Technology Company that Does Banking”. Within this mindset, the company points out that “For years, we have been building a leading technology company (...). Today, 85% of our technology workforce are engineers. Capital One has embraced advanced technology strategies and modern data environments. We have adopted agile management practices, (...). We harness highly flexible APIs and use microservices to deliver and deploy software. We've been building APIs for years, and today we have thousands that serve as the backbone for billions of customer transactions every year.” In addition, the report highlights that “The vast majority of our operating and customer-facing applications operate in the cloud (...).” Capital One was one of the first banks in the world to invest in migrating their on-premise datacenters to a cloud computing environment, which was impacted by the data leak incident in 2019.

Indeed, Amazon lists Capital One's migration to their cloud computing services as a renowned case study. Since 2014, Capital One has been expanding the use of cloud computing environments for key financial services and has set a roadmap to reduce its datacenter footprint: from 8 data centers in 2014, the last 3 are expected to be decommissioned by 2020, reducing or eliminating the cost of running on-premise datacenters and servers. In addition, Capital One worked closely with AWS to develop a security model to enable operating more securely, according to George Brady, executive vice president at Capital One.

Assessment Instructions

Part A: Industry Report - Individual

Based on the readings provided in this outline, combined with your own independent research, you are required to evaluate the implications of legislation such as GDPR on the Capital One’s business model. The structure of your report should be as follows.
Your report needs to be structured in line with the Kaplan Business School Report Writing Guide and address the following areas:

• Data Usability

- Benefits and costs of the database to its stakeholders.
- Descriptive, predictive and prescriptive applications of the data available and the data analytics software tools this would require.

• Data Security and privacy

- Data security, privacy and accuracy issues associated with the database.

• Ethical Considerations

- The ethical considerations behind whether the user has the option to opt in or opt out of having their data stored.
- Other ethical issues of gathering, maintaining and using the data.

• Artificial Intelligence

- How developments in AI intersects with data security, privacy and ethics.

• Use the resources provided as well as your own research to assist with data collection and data privacy discussions.



Capital One Financial Corporation is a well-known American bank holding company specialising in credit cards, auto loans, and savings accounts. The bank has 755 branches, including 30 café-style locations, along with 2,000 ATMs. It is ranked 97th on the Fortune 500 and 17th on Fortune's list of the 100 best companies to work for. The company helped pioneer the mass marketing of credit cards in the 1990s, and in 2016 it was the 5th largest credit card issuer by purchase volume (GDPR Compliance Australia, 2021). Capital One is a firm that values the use of technology and has become a leading U.S. bank in the adoption of cloud computing (Greve, Masuch and Trang 2020, p. 1280). Amazon has listed Capital One's migration of vital financial services into its cloud computing environment as a case study, and the company has a roadmap for reducing its data centre footprint in the near future.

The following pages evaluate the implications of legislation such as the GDPR on Capital One's business model. The subsequent sections cover data usability, data security and privacy, ethical considerations, and artificial intelligence.

Data Usability

Benefits and costs of database to its stakeholders

A database is an important tool for handling an organisation's digital processes. Its main purpose is the storage, organisation, and analysis of critical business data on matters such as staff, customers, accounts, payroll, and inventory. A database management system allows access to many kinds of data and helps create and manage huge amounts of information through a single piece of software (Novaes Neto, 2021). Data consistency is also ensured within a database, because there is no data redundancy: the data appears the same to every user who views it (Lending, Minnick and Schorno 2018, p. 420). Any change to the database is reflected immediately for all users, ensuring that no inconsistency arises.

Databases automatically take care of both backup and recovery. Users do not need to back up data periodically, because the database system takes care of it, and it can restore the database after a system failure (Spriggs, 2021). Data integrity ensures that stored data is accurate and consistent, and it is a vital property because a database's data is visible to a range of users (Vemprala and Dietrich, 2019). There is therefore a need to ensure that the collected data is correct and consistent for the database and its users.

Descriptive, predictive and prescriptive applications of the data

Capital One can use analytics to explore and examine its data, transforming findings into insights that help managers, executives, and operational employees make informed decisions. Descriptive analytics is the most common form of data analysis: historical data is collected, then organised and presented in a way that is easy to understand (GDPR Compliance Australia, 2021). Predictive analytics, by contrast, focuses on understanding what may happen in the near future and is based entirely on probabilities.

A wide range of statistical methods, known as descriptive statistics, can be used for this purpose. These comprise numerical and graphical tools that summarise a collection of data and extract vital information. IBM SPSS is a predictive analytics platform that helps users build suitable predictive models quickly (Giwah et al. 2019) and deliver predictive intelligence to groups, systems, individuals, and enterprises. Predictive analytics software tools offer advanced analytical capabilities such as text analysis, real-time analysis, statistical analysis, machine learning, and optimisation.
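As a small illustration of the descriptive side of this pipeline, the sketch below summarises a hypothetical sales series using only Python's standard library. The figures are invented for the example and are not Capital One data:

```python
import statistics

# Hypothetical monthly sales figures, used only for illustration.
sales = [120, 135, 128, 150, 142, 160]

# Descriptive analytics: summarise historical data so it is easy to understand.
summary = {
    "mean": statistics.mean(sales),
    "median": statistics.median(sales),
    "stdev": statistics.stdev(sales),
    "min": min(sales),
    "max": max(sales),
}
print(summary)
```

In practice a tool such as SPSS or a spreadsheet would produce the same summary measures, alongside graphical tools such as histograms and box plots.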

Data Security and privacy

Database security issues: Databases can be hacked by exploiting flaws in their features. Attackers can break into legitimate systems and compromise them by executing arbitrary code; although this is complex, access can sometimes be gained through basic flaws in accepted features. Security testing helps protect a database from third-party access (Poyraz et al. 2020), since each database feature can be checked and properly protected.

Data privacy: The increasing use of personal data puts data privacy at the top of business risk management. It is a challenge that is dangerous to ignore: breaching the GDPR or similar regulations such as the CCPA and HIPAA can bring hefty fines (Rosati et al. 2019, pp. 460). The damage to reputation can be the biggest threat to the business, and a breach can be a career-limiting blot on an IT manager's resume. Data privacy is often tucked away inside an IT security or disaster recovery plan, but that is not good enough, because data privacy touches every section of the business.

Data accuracy: Businesses worldwide are increasingly leaning on their data to power day-to-day operations, which makes sound data management a top directive for leading companies. Popular data management challenges include cloud deployment, the integration of multiple data sources, and maintaining accuracy (Zou et al. 2019, pp. 10). Issues with data availability and security plague enterprises of every size across all verticals.

Ethical Considerations

Data protection laws were created in the infancy of the internet, before the advent of social media, when no one had heard of the term big data. The General Data Protection Regulation (GDPR) came into effect in May 2018 and overhauled the legal framework for the privacy and protection of personal data across the EU (Truong et al. 2019, pp. 1750). The GDPR has attracted much attention because of the obligations it imposes on organisations that process personal data.

With an opt-in, it is entirely up to the person to actively reveal data about themselves for a stated use; it is, in general, a lighter touch than a full informed-consent approach. An opt-out system, by contrast, is likely to achieve much higher coverage of the population, because the default assumption is that data can be used and only a few people are likely to actively opt out (Goddard, 2017, p. 704). This matters especially for diverse populations with varied ethnicities, and an opt-in system shifts some of the ethical responsibility for the data onto the individual.

If it offers an opt-out, Capital One needs to consider some important points:

• Provide meaningful, easily available, and clear information so that people can make well-informed choices.

• Customers must not be significantly disadvantaged if they decide to opt out.

• There must be good, robust governance of data use, including independent oversight and the ability to audit (Wachter, Mittelstadt and Russell 2017, p. 841), to ensure that data is handled properly.

Artificial Intelligence

Security, privacy, and ethics are often low-priority problems for developers when building machine learning solutions. Security in particular is a serious blind spot: respondents report that they do not check for security vulnerabilities while building models. Various regulations cover vital areas of data protection, and some include clauses relating to artificial intelligence (Zhang et al. 2021, p. 106994). Work on AI governance after the GDPR has drawn lessons and identified a number of key areas to tackle at the intersection of AI and privacy, namely:

• Incentivising compliance-centred AI innovation.
• Empowering civil society through the use of AI.
• Enhancing the interoperability of AI-based governance structures.

The European GDPR is a law with a special pull on artificial intelligence, setting out requirements that cover its use. The report encourages work at both local and international levels to resolve the challenges AI governance poses for privacy, since privacy is contextual in nature.

This is useful for manufacturing as it adopts the latest technologies (Weinberg 2020). Because of the nature of this technology, more data is needed to make it more efficient and smarter, and doing so creates a number of privacy and ethical issues that must be addressed through policy and careful design. The Centre for Data Ethics and Innovation helps reduce barriers to the acceptance of artificial intelligence in society. Three areas, namely business, citizens, and the public sector, require a clear set of rules and structures for safe and ethical innovation in data and artificial intelligence (AI) (Tomás and Teixeira 2020, pp. 220). AI-based solutions will become increasingly ubiquitous in the coming years, so there is a need to act to ensure that they evolve in ethical and privacy-protecting ways.


As the preceding pages show, this report is about Capital One. Amazon lists Capital One's migration to its cloud computing environment as a renowned case study, and Capital One was one of the first banks in the world to invest in migrating its data centres to a cloud-based environment. The report evaluated the possible implications of legislation such as the GDPR on Capital One's business model across four sections: data usability, data security and privacy, ethical considerations, and artificial intelligence. It covered the data security, privacy, and accuracy problems associated with databases, and the last section gave an overview of how artificial intelligence intersects with data security, ethics, and privacy.



Data4400 Data Driven Decision Making and Forecasting IT Report Sample

Your Task

Apply forecasting techniques to a given dataset and provide a business application of the forecasts. The report is worth 30 marks (see rubric for allocation of these marks).

Assessment Description

A dataset from a retailer that has more than 45 stores in different regions (Public data from Kaggle) has been sourced. The data provided for the assessment represents two stores. Store number 20 has the highest revenue within the country and store 25 does not have a high volume of sales. The objective of the assessment is to develop different demand forecast models for these stores and compare the forecast models in terms of accuracy, trend, and seasonality alignment with the historical data provided. Students must use visual inspection, error metrics and information criteria on the test data to provide conclusions.

Assessment Instructions

In class: You will be presented with a dataset in class. As a group, analyse the dataset using Tableau and Exploratory.io. You will provide an oral presentation of the group work in parts A to C during the third hour of the workshop.

The data set will be posted or emailed to you at the beginning of class in Week 6.

After class: Individually write a 1000-word report that briefly summarises the analysis and provides suggestions for further analysis. This component of the assessment is to be submitted via Turnitin by Tuesday of Week 7. No marks will be awarded for the assessment unless this report is submitted.

Hint: take notes during the group assessment to use as prompts for your report.

As a group:

Part A

- Use Tableau to compare the two stores in terms of sales using adequate visualisation(s).
- Run Holt-Winters forecasts of the next 5 months for stores 20 and 25.
- Analyse the results of the forecasts in terms of:
o Accuracy
o Alignment with the historical trend
o Alignment with the historical seasonality

Part B

- Use Exploratory to generate ARIMA forecasts for stores 20 and 25.
- Create visualisations, interpret and describe your findings.
- Analyse the forecasts in terms of:
o Accuracy
o Alignment with the historical trend.
o Alignment with the historical seasonality.

Part C

Prepare a presentation:
• Include key findings.
• Highlight methodologies.
• Advise which methods to use for each store.
• Recommend improvements in terms of forecasting for the retailer.

Note: All members of the group should be involved in the presentation. The allocated time for the presentation will be decided by your lecturer.



The ability of organisations to base decisions on empirical evidence rather than preconceptions makes data-driven decision-making and forecasting essential. Forecasting trends supports proactive tactics, resource optimisation, and market leadership in fast-moving environments. With the aid of various forecasting models, including ARIMA and Holt-Winters, this study aims to visualise the sales of STORE 20 and STORE 25 and to forecast sales based on historical sales trends.

Discussion on Results

Figure 1: Visualization of STORE 25 sales

Figure 2: Forecast result of STORE 25 sales

With a decline of 336,738 units from the initial figure of 3,149,931 units in October 2012, the projection for STORE 25 sales from October 2012 to February 2013 shows a downward trend. With a peak in December 2012 (1,616,475) and a trough in January 2013 (-563,853 units), the seasonal effect is clear.

Figure 3: Sum of value for STORE 25 sales

It appears reasonable to use the selected additive model for level, trend, and seasonality. The forecast's accuracy is fairly high, with a low MAPE of 10.8%, despite occasional forecast errors reflected in measures such as RMSE (383,833) and MAE (296,225). This shows that the model effectively captures the underlying patterns, supporting sound decisions for the STORE 25 sales strategy.
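The error measures quoted throughout this report (RMSE, MAE, and MAPE) can be computed directly from paired actual and forecast values. The sketch below shows the standard formulas in plain Python; the actual/forecast figures are invented for illustration and are not the store data from this report:

```python
import math

def mae(actual, forecast):
    """Mean absolute error: average size of the forecast errors."""
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)

def rmse(actual, forecast):
    """Root mean squared error: penalises large errors more heavily."""
    return math.sqrt(sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual))

def mape(actual, forecast):
    """Mean absolute percentage error, expressed in percent."""
    return 100 * sum(abs((a - f) / a) for a, f in zip(actual, forecast)) / len(actual)

# Illustrative figures only.
actual = [100.0, 200.0, 300.0]
forecast = [110.0, 190.0, 330.0]
print(mae(actual, forecast), rmse(actual, forecast), mape(actual, forecast))
```

Because RMSE squares the errors before averaging, a gap between RMSE and MAE (as with STORE 25 above) indicates that a few forecast periods carry disproportionately large errors.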

Figure 4: Visualization of STORE 20 sales

Figure 5: Forecast result of STORE 20 sales

A time series methodology was used to determine the sales prediction for STORE 20 for the period from October 2012 to February 2013. Notably, it was clear that an additive model for level and trend had been used and that there was no identifiable seasonal regularity. Sales began at roughly $9.88 million in October 2012, and by February 2013, they had increased by $197,857.

Figure 6: Sum of value for STORE 20 sales

Quality metrics showed an RMSE of $1.3 million and a fair degree of accuracy. The forecast's relative accuracy may be seen in the forecast's mean absolute percentage error (MAPE), which was 12.4%. STORE 20's sales trend could be understood by the chosen model despite the lack of a pronounced seasonal effect.

Figure 7: Visualization of HOLT-WINTERS test for STORE 25 sales

Figure 8: Result of HOLT-WINTERS test for STORE 25 sales

When five periods of STORE 25 sales data are smoothed using the Holt-Winters exponential method, a downward trend is evident. The forecast values start at 3,028,050.52 and drop successively to 2,949,111.42. The upper and lower limits reflect this tendency, with the upper bound ranging from 4,165,588.2 down to 4,108,064.45 and the lower bound from 1,890,512.83 down to 1,790,158.39. This means the sales forecast for STORE 25 will continue to drop.

Figure 9: Visualization of HOLT-WINTERS test for STORE 20 sales

Figure 10: Result of HOLT-WINTERS test for STORE 20 sales

The STORE 20 sales data were smoothed using the Holt-Winters exponential projection for five periods. The predicted values show an upward trend over the specified periods, rising from 9,692,132.56 to 9,838,792.22. The forecast's upper and lower ranges also climb, with upper bounds between 12,274,556.54 and 12,428,330.21 and lower bounds between 7,109,708.57 and 7,249,254.23. This implies a steady upward growth trajectory for sales at STORE 20.
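The Holt-Winters results above rest on exponentially smoothing a level and a trend. As a rough illustration of the mechanics (ignoring the seasonal component, so strictly Holt's linear method), the sketch below smooths a small hypothetical upward-trending series; in practice a library implementation, such as the built-in exponential smoothing in Tableau or `statsmodels`, would be used:

```python
def holt_linear(series, alpha, beta, horizon):
    """Holt's double exponential smoothing (level + additive trend).
    A minimal sketch only; no seasonal term and no prediction intervals."""
    level, trend = series[0], series[1] - series[0]
    for y in series[1:]:
        prev_level = level
        level = alpha * y + (1 - alpha) * (level + trend)   # smooth the level
        trend = beta * (level - prev_level) + (1 - beta) * trend  # smooth the trend
    # h-step-ahead forecasts extrapolate the final level and trend.
    return [level + (h + 1) * trend for h in range(horizon)]

# Hypothetical upward-trending series, in the spirit of STORE 20 (values in $M).
history = [9.0, 9.2, 9.4, 9.5, 9.7, 9.9]
print(holt_linear(history, alpha=0.5, beta=0.3, horizon=5))
```

With an upward-trending input the five forecasts climb steadily, mirroring the rising point forecasts reported for STORE 20; a declining input would reproduce the STORE 25 pattern.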

Figure 11: Visualization of ARIMA test for STORE 25 sales

Figure 12: Visualization of ARIMA test for STORE 20 sales

Figure 13: Quality performance of ARIMA model for STORE 25 sales


The quality performance of the ARIMA model for STORE 25 sales is encouraging. The MAE (9,455.64) and MAPE (0.0034%) are low, indicating that the forecasts are correct. Moderate variability is shown by RMSE (29,901.35). The model outperforms a naive strategy, according to MASE (0.460). The model's appropriateness is supported by its AIC and BIC values of 73,748.40.

Figure 14: Quality performance of ARIMA model for STORE 20 sales

For STORE 20 sales, the quality performance of the ARIMA model is inconsistent. RMSE (86,950.12) denotes increased variability whereas MAE (27,496.04) and MAPE (0.0033%) suggest relatively accurate predictions. MASE (0.508) indicates that the model performs somewhat better than a naive strategy. A reasonable model fit is indicated by the AIC (78,652.94) and BIC (78,658.86).
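The MASE values quoted above scale the forecast's MAE by the in-sample MAE of a naive previous-value forecast, so values below 1 (such as 0.460 and 0.508 here) indicate the model beats the naive benchmark. A minimal sketch, with invented numbers:

```python
def mase(train, actual, forecast):
    """Mean absolute scaled error: forecast MAE divided by the in-sample
    MAE of a naive (previous-value) forecast on the training data."""
    naive_mae = sum(abs(train[i] - train[i - 1])
                    for i in range(1, len(train))) / (len(train) - 1)
    fc_mae = sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)
    return fc_mae / naive_mae

# Illustrative figures only, not the store data from this report.
train = [10.0, 12.0, 11.0, 13.0]
actual = [14.0, 15.0]
forecast = [13.5, 15.5]
print(mase(train, actual, forecast))
```

Because MASE is scale-free, it lets the ARIMA and Holt-Winters models be compared across the two stores despite their very different sales volumes.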

Figure 15: Quality performance of HOLT-WINTERS test for STORE 25 sales

The performance of the HOLT-WINTERS model for STORE 25 sales contains flaws. The Mean Error (ME) of -37,486.18 shows a bias in the forecasts. Despite moderate RMSE (580,387.03) and MAE (435,527.36) values, the MAPE of 15.47% indicates significant percentage errors. The MASE of 0.708 denotes improvement relative to a naive forecast, while the negative ACF1 (-0.097) suggests the predictive model may have been overfitted.

Figure 16: Quality performance of HOLT-WINTERS test for STORE 20 sales

The performance of the HOLT-WINTERS model for sales at STORE 20 also shows limitations. The Mean Error (ME) of -152,449.83 shows that the forecasts are biased. RMSE (1,317,587.47) and MAE (1,043,392.14) show substantial errors, while MAPE (13.54%) and MASE (0.731) point to accuracy issues. The low ACF1 (-0.25) suggests the prediction model may have been overfit.

Key Findings and Recommendations

Key Findings:

1. STORE 20 frequently outsells STORE 25, especially throughout the winter.

2. Holt-Winters forecasting works well for STORE 20 because of its ascending trend, but ARIMA works well for STORE 25 because of its declining pattern.

Recommendations:

1. In order to capitalise on the increasing trend, Holt-Winters will be useful for STORE 20 sales estimates.

2. ARIMA will be used for STORE 25 sales predictions in order to take into account its decreasing tendency.

3. Strategic resource allocation based on each store's unique trends will help maximise sales at each shop.


Precision and strategic planning are greatly improved by data-driven forecasting and decision-making. We visualised and examined sales trends for STORE 20 and STORE 25 using a variety of forecasting models, including ARIMA and HOLT-WINTERS. The findings offer guidance for developing tactics that can take advantage of the unique sales trends found in each location.




 MIS500 Foundations of Information Systems Report-3 Sample

Task Summary

This assessment task requires you to reflect on your experiences in MIS500 this trimester by following a four-step process to gain insights into the work you have done and how it relates to your own career and life more broadly. In doing so, you will need to produce a weekly journal to record your learning and then, as the trimester comes to a close, reflect on these experiences and submit a final reflection of 1500 words (+/- 10%) that will include the weekly journal as an appendix.


This is an individual assignment that tracks your growth as a student of Information Systems over the trimester. It is scaffolded around your weekly learning activities. Completing the activities and seeking input from your peers and the learning facilitator is essential for you to achieve a positive result in this subject. Before you start this assessment, be sure that you have completed the learning activities in all of the modules. This reflective report gives you the opportunity to communicate your understanding of how information systems relate to your career and future.

Task Instructions

1. During Modules 1-5, you were asked to produce a weekly journal to record your learnings each week. Based on these weekly journals, please write a 1500-word reflective report about your experience, focusing on how this will support developing and planning your future career.

2. You are required to follow the four steps of Kolb’s learning cycle when writing the reflective report.
You will keep a learning journal throughout the trimester. Each week as you complete the learning activities you record your experience in spreadsheet or word document.

A suggested format for the learning journal is as follows:

Table 1: Learning Journal

For each day in your learning journey, write the date and then the learning activity you engaged in. Detail what impact the learning had on you and then include any evidence you might like to keep for use later on. This journal should be appended to this assessment when you submit it.

Figure 1 – Kolb’s Learning Cycle



In this study, I have reflected on my learning experience in the MIS500 modules, structured around Kolb's learning cycle. This model explains that effective learning is a progressive process in which learners' knowledge develops as their understanding of a particular subject matter deepens. Kolb's learning cycle has four phases: concrete experience, reflective observation, abstract conceptualisation and active experimentation. The learning process will help me to develop my career in the field of information technology.

Concrete Experience

Before the first module, I had little idea about the use of information systems in business; I was therefore at the concrete experience stage of Kolb's model. In this first stage, learners encounter new knowledge, concepts and ideas, or reinterpret the ideas, concepts and knowledge they already hold (Hydrie et al., 2021). I learnt that the use of information systems for making rational business decisions is called business intelligence, a concept I had no knowledge of before the first module. It therefore gave me new knowledge, and I began to think about how to develop my career in the field of business intelligence and about the learning strategies that could enhance my knowledge of the professional field.

The next modules helped me to deepen my understanding of business intelligence. I learnt that the emerging area of business intelligence is a result of the worldwide digital revolution: the growth in the number of users of digital communication tools and technologies, such as smartphones, other computers and the internet. Evidence for this comes from the report "The Global State of Digital in October 2019", cited in module 1.2, which reports around 5.155 billion unique mobile phone users worldwide, 4.48 billion internet users and 3.725 billion social media users (Kemp, 2019). Digital technologies therefore have high global penetration, which helped me to see that I want to develop my career in business intelligence: the digital revolution has created thousands of career opportunities in the field. Business organisations need to use digital technologies to communicate with people and customers who use digital devices, and these technologies are helping organisations to grow and expand internationally (Hydrie et al., 2021). Many businesses have transformed themselves from local to global players with the help of digital technologies.

Reflective Observation

When I started module 2, I learnt how business organisations use data to gain a competitive advantage over their competitors. In the digital economy, an organisation that holds relevant data can reach its targeted customers, improve its products and services and leverage new opportunities (Hydrie et al., 2021). Data management and information management are therefore vital to an organisation's success: by collecting and managing data effectively, companies obtain the knowledge they need to achieve their business goals. By the time I studied this module I had reached the reflective observation stage, because I had started to reflect on new experiences by explaining why businesses need to digitise themselves. Reflective observation, the second stage of Kolb's model, involves reflecting on a new experience gained through learning (Hydrie et al., 2021) and comparing the new knowledge or experience with existing knowledge to identify a knowledge gap. This stage showed me what more I need to learn to develop a career as a business intelligence or information systems professional.

In the following modules, I tried to bridge that knowledge gap. In module 2.2, I learnt about the concepts of knowledge management and big data. Knowledge management is a collection of approaches for gathering, sharing, creating, using and managing an organisation's knowledge and information (Arif et al., 2020); it is crucial for organisations seeking meaningful insights from collected data. Big data, by contrast, refers to data of high volume and velocity; it helps to identify important patterns in events and processes and facilitates decision-making for business organisations.

These information systems include human resource information systems (HRIS), enterprise resource planning (ERP) systems and customer relationship management (CRM) systems (Arif et al., 2020). This module played a vital role in shaping my knowledge by showing me the practical use of information technology and information systems in business operations. I learnt how information systems help to describe and analyse an organisation's value chain, which consists of the main activities and the supporting activities that together allow the organisation to carry out all its operations.
Module 3.1 also helped to bridge the knowledge gap. In this module I reached the abstract conceptualisation stage of Kolb's learning model, in which learners develop new ideas or modify existing ideas about a concept. I started to apply my learning to how business organisations could use information systems more effectively, and in doing so revised my existing knowledge of their applications in business.

Abstract Conceptualisation

Abstract conceptualisation is the stage of Kolb's learning cycle in which learners form a personal response to new experiences. In this stage, I started to think about how to use my new knowledge to advance my career, and I decided to learn more about ERP and CRM: if I understand these two types of information system, I can learn to develop them and help organisations apply them. This shaped my knowledge of area-specific information systems that help organisations meet the needs of particular business operations (Arif et al., 2020). ERP is an information system for planning and managing an organisation's resources and for carrying out supply chain operations; its main functions are inventory planning and management, demand forecasting and managing operations involving suppliers, wholesalers and retailers. CRM, by contrast, helps manage relations with an organisation's customers (Hamdani & Susilawati, 2018): it helps to identify and resolve customers' grievances, and it allows organisations to communicate with customers to understand their needs and inform them effectively about product and service offerings. In module 4.2, I learnt how an organisation selects its business information system. The selection depends on the organisation's business architecture, a holistic overview of its operations, business capabilities, value-delivery processes and operational requirements (Hamdani & Susilawati, 2018). The information system that best suits an organisation's business architecture is the most suitable one for it, although the prices set by vendors also influence the decision.

Active Experimentation

Active experimentation is the stage in which learners decide what to do with the knowledge they have gained (Hydrie et al., 2021). I used reflection to decide how to apply my knowledge of information systems to my career. Harvard research from 2016 also explains the importance of reflection (module 5.2): reflecting on previous experiences helps individuals and organisations to recall what they learnt and to find scope for improving their existing skills and knowledge (Rigby, Sutherland & Noble, 2018). If business organisations reflect on their previous experiences of IT operations, they can likewise improve their knowledge of those operations, find scope for improvement, and refine their future IT strategies to achieve business goals and secure future success. Reflection can thus be an effective source of learning for organisations. Reflecting on my own learning showed me that I want to become a big data analyst, because the demand for big data analysis is increasing across many fields and I have built effective knowledge of it. I will always follow the ethics of my profession to maintain professionalism, because ethical responsibility and professional conduct are expected of IT professionals (McNamara et al., 2018).


In conclusion, my learning experience helped me to grasp new concepts in information systems and to bridge my knowledge gap about their use in business analytics. Based on this learning, I found that I have gained effective knowledge of big data analysis, and I therefore want to develop my career in that field.


Read More

MIS607 Cybersecurity Report Sample

Task Summary

Reflecting on your initial report (A2), the organisation has decided to continue to employ you for the next phase: risk analysis and development of the mitigation plan.

The organisation has become aware that the Australian Government (AG) has developed strict privacy requirements for business. The company wishes you to produce a brief summary of these based on real-world Australian government requirements (similar to how you used real-world information in A2 for the real-world attack).

These include the Australian Privacy Principles (APPs), especially the requirements on notifiable data breaches. PEP wants you to examine these requirements and advise them on their legal obligations. Also ensure that your threat list includes attacks that could lead to customer data breaches. The company wishes to know whether the GDPR applies to them.

You need to include a brief discussion of the APPs and the GDPR and the relationship between them. This should show the main points.

Be careful not to use up word count discussing cybersecurity basics. This is not an exercise in summarising your class notes, and such material will not count towards marks. You can cover theory outside the classes.


Assessment 3 (A3) is a continuation of A2. You will start with the threat list from A2, although feel free to make changes to the threat list if it is not suitable for A3. You may need to include threats related to privacy concerns.

Beginning with the threat list:

• You need to align threats/vulnerabilities, as much as possible, with controls.

• Perform a risk analysis and determine controls to be employed.

• Combine the controls into a project of mitigation.

• Give advice on the need for ongoing cybersecurity, after your main mitigation steps.


• You must use the risk matrix approach covered in classes. Remember risk = likelihood x consequence. (Use the tables from Stallings and Brown and remember to reference them in the caption.)

• You should show evidence of gathering data on likelihood, and consequence, for each threat identified. You should briefly explain how this was done.

• At least one of the risks must be so trivial and/or so expensive to control that you decide not to control it (in other words, in this case, accept the risk). At least one of the risks, but obviously not all.

• Provide cost estimates for the controls, including policy or training controls. You can make up these values but try to justify at least one of the costs (if possible, use links to justify costs).

Reference Requirement

A3 requires at least 5 references (but as many as you like above this number) with at least 3 references coming from peer-reviewed sources: conferences or journals. (Please put a star “*” after these in the reference section to highlight which are peer reviewed.)

One of the peer-reviewed articles must be uploaded in pdf format along with the A3 report (this can be done in BB). This pdf will be referred to here as the “nominated article”. (Zero marks for referencing if the nominated article is not itself peer-reviewed.) Of course, the nominated article should be properly referenced and cited, but you need to cite an important direct quote from within the article (with page number), not just a brief sentence from the abstract. The quote should also relate to the main topic of the article, not just a side issue.



Addressing cybersecurity threats is a crucial step for any organisation that wants to make its information more secure than before. Cyber threats have a huge impact on many types of business and on the tools used to resolve them. This report prepares a threat security plan for a packaging company named PEP, using the attack on JBS Foods as its reference case.

PEP management wants safeguards that would mitigate an attack like the one on JBS Foods. A cybersecurity specialist is required to identify all the threats and vulnerabilities an intruder might exploit. This report describes the relevant cybersecurity factors in detail and lists all identified threats and vulnerabilities. The STRIDE methodology is important for understanding the different types of cyber threat facing the organisation.

PEP will implement the STRIDE methodology to address the different types of cyberattack it faces, which can also support solid growth in the organisation.

Body of the report:

Discussion of the APPs and GDPR:

APP: The Australian Privacy Principles are set out in the Privacy Act, one of the key pieces of Australian legislation protecting personal information. There are 13 principles, summarised below, covering the rules and purposes an organisation must observe when handling personal information.

The principles are:

APP 1: Open and transparent management of personal information. Entities must manage personal information openly and transparently across the management hierarchy, which includes maintaining a clearly expressed, up-to-date APP privacy policy.

APP 2: Anonymity and pseudonymity. Individuals must have the option of not identifying themselves, or of using a pseudonym, subject to limited exceptions.

APP 3: Collection of solicited personal information. Personal information, and especially sensitive information, may only be collected where reasonably necessary and must be handled carefully.

APP 4: Dealing with unsolicited personal information. Personal information received without being solicited must be dealt with properly, and destroyed or de-identified where it could not lawfully have been collected.

APP 5: Notification of the collection of personal information. Individuals must be notified of the circumstances in which their personal information is collected.

APP 6: Use or disclosure of personal information. Personal information may only be used or disclosed for the purpose for which it was collected, or where certain other requirements are met.

APP 7: Direct marketing. Personal information may only be used for direct marketing under specific conditions.

APP 8: Cross-border disclosure of personal information. Before disclosing personal information overseas, an entity must take steps to ensure it remains protected.

APP 9: Adoption, use or disclosure of government related identifiers. Government related identifiers may be adopted only in limited circumstances.

APP 10: Quality of personal information. Collection processes should be smooth and accurate so that the personal information held is accurate, up to date and complete.

APP 11: Security of personal information. Entities must take reasonable steps to protect personal information from misuse, interference, loss and unauthorised access, and must destroy or de-identify information that is no longer needed.

APP 12: Access to personal information. Entities are obliged to give individuals access to the personal information held about them on request.

APP 13: Correction of personal information. Entities must correct errors in personal information so that it remains accurate.


GDPR: The GDPR is the European Union's General Data Protection Regulation (retained in UK law as the UK GDPR). It sets out seven key principles that help an organisation build an effective privacy and security posture against future threats: lawfulness, fairness and transparency; purpose limitation; data minimisation; accuracy; storage limitation; integrity and confidentiality (security); and accountability. The GDPR overlaps substantially with the Australian Privacy Principles, so complying with one goes a long way towards complying with the other, and it matters to PEP because the GDPR applies to any organisation handling the personal data of individuals in the EU or UK. JBS Foods, the reference case for this report, runs some of the most important food processing and packaging centres in the world, which illustrates the scale of impact such requirements can have on organisational growth.

Threat lists and STRIDE categorization:

Cyber threats come in many forms, and identifying them is essential if the business is to grow sustainably. In this report, a threat modelling process is organised to improve the security control system, and the STRIDE model is introduced to mitigate potential vulnerabilities. STRIDE defines six threat categories, each of which can significantly affect PEP's business model. The main cyber threats considered here are ransomware/malware, denial of service, phishing and SQL injection. "Nuclear deterrence is viewed so positively that cyber-deterrence is frequently suggested as a promising analogous next step" (Hellman, 2017, 52).

1. Ransomware:

Ransomware (malware) attacks encrypt or hold hostage files on IT systems until the victim pays the attacker, and they frequently accompany a broader security breach. "The sophistication and creativity of hackers today is pretty scary," one hospital HIPAA security officer observes; "you really have to be on your toes and pay attention, because viruses, malware and computer security threats change almost daily" (Jones-Foster, 2014). Malicious and infected websites and phishing emails are key vectors for stealing customer information (Ekelund & Iskoujina, 2019). A ransomware attack is capable of halting essential operations at a start-up, and PEP, as a start-up store bringing its products to market, is exposed to exactly this risk (Cox, 2008).

2. DDoS attack:

A distributed denial of service (DDoS) attack is another common weapon for cyber criminals. Attackers prevent legitimate users from accessing a service, typically by spoofing IP addresses and flooding the victim's servers with traffic from a large number of connections (Banham, 2017).

3. Social attack: Here attackers try to harvest credentials or steal important information directly from users, often by tricking them into installing malware on their devices. Phishing is one of the most important such tools (Benzel, 2021, 26): attackers send emails designed to capture login credentials (Cox, 2008). "Social engineering, where hackers manipulate employees to gain access to passwords and systems, is one of the main risks organizations face. Therefore, encouraging employees to be vigilant regarding company information is very important."

4. SQL injection: Another type of cyber threat, in which an attack is established by inserting malicious code into SQL statements. Once the server is compromised, it releases information it should protect, and the malicious code can steal data belonging to users (Bertino & Sandhu, 2005).

5. Emotet: CISA has described Emotet in detail, characterising it as among the most costly and destructive malware in circulation.


The STRIDE model is a useful framework that classifies threats into six categories: Spoofing, Tampering, Repudiation, Information disclosure, Denial of service, and Elevation of privilege.


Spoofing: an attacker impersonates a legitimate user or system. Authentication controls ensure that only genuine users can access the required information, in line with the company's standards.

Tampering: unauthorised modification of data on disk, in memory or on the network. Integrity controls are the countermeasure and enable responsible corrective action.

Repudiation: a user denies having performed an action. Logging and non-repudiation controls preserve evidence of what was done.

Information disclosure: information is exposed to parties who are not authorised to see it, for example when data is not end-to-end encrypted.

Denial of service: access to required resources is denied, making the service unavailable to legitimate users.

Elevation of privilege: proper authorisation is bypassed, giving a user access rights they should not have. This can damage the overall infrastructure of Peter Excellent Packers.
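The six categories above can be captured as a simple lookup table. The mapping below is a toy illustration using hypothetical threat names; a real categorisation would need analysis of each threat's mechanism, and a single threat can span several categories.

```python
# A toy STRIDE categorisation of threats like those identified for PEP.
# Both the threat names and the one-to-one mapping are illustrative.

STRIDE = ("Spoofing", "Tampering", "Repudiation",
          "Information disclosure", "Denial of service",
          "Elevation of privilege")

threat_to_stride = {
    "Phishing email":       "Spoofing",
    "Ransomware":           "Tampering",
    "Unsigned audit logs":  "Repudiation",
    "SQL injection":        "Information disclosure",
    "DDoS flood":           "Denial of service",
    "Weak admin passwords": "Elevation of privilege",
}

# Sanity check: every threat maps to a valid STRIDE category.
assert all(cat in STRIDE for cat in threat_to_stride.values())
for threat, cat in threat_to_stride.items():
    print(f"{threat:22s} -> {cat}")
```

Keeping the categorisation in a structure like this makes it easy to check that every threat on the threat list has been placed in at least one STRIDE category before moving on to risk analysis.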

Threat analysis:

Threat factors are measured here against multiple risk factors within the organisation; several threats can combine to raise the organisation's overall cybersecurity risk. All cyber threat factors are listed in the table. "While cyber-warfare might use some precepts of kinetic warfare, others had little significance in cyberspace" (Lilienthal & Ahmad, 2015).

Cyber Threats:

Password hacking:

Cybersecurity threat analysis involves prioritising the different risks, including DDoS and malware attacks. Ransomware is particularly dangerous because it can steal users' entire transaction histories from the transactional database.

DDoS attack: Analysing its severity, this is rated a medium risk factor capable of stealing required information from the customer table. In risk factor analysis, the severity of each individual risk factor shapes its impact on organisational growth, and a scaling technique is used to measure the severity of cyberattacks within the organisation.

The social attack: This attack is rated high priority with a high level of consequences, and phishing attacks in particular are severe risk factors. Intruders send ransom emails designed to plant a foothold in the organisation's systems and steal information from users. Because customers readily open mail that appears to come from PEP, such attacks directly affect the trust of potential and existing customers.

The weak password policy: Cloud-based services have been hacked via weak passwords. A weak password policy makes it easier to lose sensitive and personal information from existing data sets. This risk can be reduced by enforcing strong password requirements.

Risk Register matrix:


Figure 4:Risk matrix
(Source: Stallings & Brown, 2018)

According to the risk register matrix, the risk factors are prioritised as follows:

1. Social attack
2. DDoS attack
3. Password hacking attack
4. Weak password policy.
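The ranking above can be reproduced with the risk = likelihood x consequence formula from the brief. The likelihood and consequence scores below are hypothetical values on a 1-5 scale, chosen only to illustrate the calculation and to reproduce the stated ordering.

```python
# Illustrative risk-matrix calculation (risk = likelihood x consequence),
# in the spirit of Stallings & Brown's qualitative scales. The scores are
# made-up 1-5 values, not measurements from the report.

threats = {
    "Social attack":        {"likelihood": 5, "consequence": 5},
    "DDoS attack":          {"likelihood": 4, "consequence": 4},
    "Password hacking":     {"likelihood": 3, "consequence": 4},
    "Weak password policy": {"likelihood": 3, "consequence": 3},
}

def risk(name):
    s = threats[name]
    return s["likelihood"] * s["consequence"]

# Rank threats by risk score, highest first.
ranked = sorted(threats, key=risk, reverse=True)
for name in ranked:
    print(f"{name:22s} risk = {risk(name)}")
```

With these scores the social attack scores 25 and tops the list while the weak password policy scores 9 and comes last, matching the priority order given above.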

Threat controls:

According to the overall analysis, it is very important to address all cyber threat factors in order to prevent issues within the organisation. Phishing is rated one of the highest threats because it plants a foothold inside the organisation's main systems, and a range of measures exist to mitigate it and the other identified threats. The control measures below are each proposed with an estimated budget.
The threat resolution process is discussed here for the threats identified at the start-up organisation PEP. Applied across the IT security infrastructure, these methods can support organisational growth.

Figure 5: Threat controls
(Source: Banham, 2017)

Proper knowledge of IT assets:

BYOD devices and third-party components are core services used by all employees within the organisation.

The supervisor of IT infrastructure should be aware of the different types of vulnerability they introduce. The minimum cost estimate for managing all IT assets is $50,000.
Strong IT security protocol:

Security on IT devices, including BYOD devices, must be extended, and all transactional information and databases must be updated on a regular basis. A strong security protocol is necessary for protecting both the internal and external environment. Estimated cost: $20,000 (McLaughlin, 2011).

Real-time visibility: With real-time visibility, the team can stay alert and avoid issues at the grassroots level, and this organisational control can support growth.

A QA analysis team must be incorporated here to support organisational growth. The system requires $10,000 in maintenance charges.

Continuous, actionable and adaptive risk control:

According to risk severity, the management team should define a resolution structure for identifying threats early.

The team should focus on mitigating issues at the grassroots level, and technological defences should be checked regularly so that vulnerabilities are identified before they enter the system. This risk security control requires $10,000.

These are the main threat control measures for the identified cybersecurity threats. Incorporating such a strategy reduces future threats and gives better visibility into which risk resolution technique is needed for each issue.
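As a quick sanity check, the cost estimates quoted for the four controls above can be totalled into an overall mitigation budget. The figures are the made-up estimates from this report, not real vendor quotes.

```python
# Summing the illustrative control-cost estimates quoted above into a
# total mitigation budget for PEP. All values are the report's own
# hypothetical figures.

control_costs = {
    "IT asset management":               50_000,
    "Strong IT security protocol":       20_000,
    "Real-time visibility / QA team":    10_000,
    "Continuous, adaptive risk control": 10_000,
}

total = sum(control_costs.values())
print(f"Total estimated mitigation budget: ${total:,}")
```

Laying the costs out this way also makes it easy to justify (or revisit) each line item individually, as the brief asks.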
Mitigation scheme:

A cybersecurity risk mitigation scheme is an important mechanism for reducing the impact of cyber threats. It separates controls into three elements: prevention, detection and remedies. The cybersecurity risk factors identified here can be mitigated by the strategies outlined below.

Improving network access control: Proper network access control needs to be established to mitigate insider threats, and many organisations are already strengthening theirs. This measure reduces both likelihood and consequence, and bringing all connected devices under IT management increases endpoint security.

Firewall protection and antivirus: Cybersecurity risk can be reduced by deploying firewalls and antivirus software. These technologies provide strong protection against intruders, and firewalls also control outgoing traffic (Stiawan et al., 2017).

Antivirus software is likewise very useful for identifying malicious threats that could cause significant damage within the organisation.

Monitoring network traffic: Proactive action is very important for mitigating cybersecurity risk. Continuous traffic monitoring improves the organisation's cybersecurity posture, and a comprehensive view of the IT ecosystem supports growth. It helps to identify new threats as they appear and shortens the path to remediation.
Building a response plan:

PEP must ensure that both IT security teams and non-technical employees know their responsibilities in the event of a data breach within the organisation.

An incident response plan is a useful technique for mitigating cyber risk and improving the network environment: it prepares a team to handle an incident when it occurs. Security ratings are another important strategy, providing feedback on how well the implemented control measures are working.


In this report, cybersecurity threat factors were discussed in detail, along with the different measures available to reduce them. PEP was taken as the case organisation to identify future threats and the resolution factors that remove them at the grassroots level. A risk matrix was used to identify the severity of each risk factor, and according to the risk scale analysis, resolutions were described for mitigating the cyber threats, each with a cost estimate for implementation. Together these measures can enhance the organisation's growth.



Read More

MIS609 Data Management and Analytics Report 3 Sample

Task Summary

In groups, apply data analysis and visualisation skills, and create meaningful data visualisations for secondary data (selected after approval of your facilitator). Then, create a 5-10-minute presentation as a short video collage explaining the findings of your visual analysis.

Task Instructions

Step 1: Group formation and registration

Form groups of two members each. Please refer to the attached Group Formation Guidelines document for more information about group formation and registration. You may seek help from your learning facilitator.

Step 2: Select secondary data from the internet. Make sure you select the data after approval of your learning facilitator.

Step 3: Find out the issues that the selected data has. Make note of all the identified issues. For every issue identified you should have a solid rationale.

Step 4: Select a data analysis / visualisation tool. You can surf the web and find out some free data analysis / visualisation tools. Some of the recommended tools are Tableau, Microsoft Excel, Microsoft Power BI, Qlik and Google Data Studio. Make sure that before you select a tool, you carefully understand the merits and demerits of that tool. Also discuss with your facilitator the choice of your tool.

Step 5: Analyse selected data using the selected tool and try creating meaningful visualisations that give you more visual insight into data.

Step 6: Based on the analysis using visualisation, identify important findings about the data.

Step 7: Carefully identify your limitations.

Step 8: Now create a Microsoft PowerPoint presentation having the following sections:

Section 1: Selected Dataset

- Provide a link to data.
- Explain why you selected this data.
- Explain the issues in the selected data.

Section 2: Selected Analysis/Visualisation Tool

- Explain why you selected this tool.

- What were the problems that you faced with this tool?

- What are the benefits and drawbacks of the selected tool?

Section 3: Visualisations (Diagrams)

- On every PowerPoint slide, copy a diagram (visualisation) and beneath every diagram briefly explain what information/knowledge you obtained from it. Make as many PowerPoint slides as you need.

Section 4: Findings and Limitations

- Explain what your findings are about the selected data as a result of data analysis (/ visualisation) that you have performed.

- List your limitations.

Section 5: Group Work Distribution

- Categorically explain how work was distributed between group members.

Step 9: Now using the PowerPoint, create a video collage in which your facilitator should be able to see you as well as the PowerPoint slides in the video. Please note that this 5-10-minute video is like an online presentation. Both members of the group should take part in the video equally. Please ensure that you are objective in your presentation (PowerPoint and video). Plan and rehearse what you have to speak in the video before creating it.




The aim of the report is to identify issues within the aspects of sustainability and business ethics. It is focused on demonstrating learning in the field by analysing and providing recommendations with reference to a real-life case from an organisation. The topic researched in the current report is the new shopping experience unveiled by Zara.


It is an important topic because sustainability has become very important in the customer journey. Customers are likely to incline their purchasing behaviour in favour of those brands that make efforts to ensure their business operations are not harmful to the environment and community (Pei et al., 2020). Zara has integrated sustainability into the customer journey at its Plaza de Lugo store, slated to reopen in A Coruña in March. It will unveil new concepts, materials and designs that will establish a benchmark for the brand.

Major Issue and Outline of Methodology

The major issue addressed in the report is that of fast fashion. It can be described as trendy, cheap clothing that is representative of celebrity culture. Zara is one of the most prominent dealers in fast fashion, and the brand is looking to make a change by presenting an experience of sustainable clothing.

In the current report, data would be collected through secondary sources. This will allow the researcher to integrate the viewpoints of other individuals regarding sustainable clothing, the customer journey and its relationship with business ethics. These viewpoints are then studied with reference to Zara in the later stages of the report to identify correlations and provide recommendations for dealing with the situation.

Contribution of Assignment

This report will contribute to the literature on the sustainable customer journey and clothing, and how these impact the business ethics and sustainability of the company. It will also provide recommendations to Zara's managers on how they could deal with the issues of fast fashion and secure the overall sustainability of the business.

Literature Review

Concept of Sustainable Shopping experience

According to Ijaz and Rhee (2018), shopping experiences and sustainability are the major elements affecting how customers shop now and will continue to shop in the future. They have led to considerable changes in the global retail landscape which will inevitably shape the future retail environment.

Han et al. (2019) stated that in order to attract shoppers to the physical retail space, it is necessary to provide them with sustainability and aesthetics. This is because shoppers are likely to be attracted by a space where they can encounter a wide variety of reactions, experiences and emotions.

In the perception of Lin et al. (2020), the importance of light, texture, sound and smell has taken centre stage, with store designers combining subconscious and sensory elements to generate experiences and memories which are not only visual but also atmospheric.

However, De et al. (2021) argued that stores in the future are likely to merge with online retail environments rather than compete with them. This makes it more important for current retailers to improve their shopping experience when it comes to dominating the online space. The physical store is likely to become a space where retailers and brands can express their personality to customers.

As per the views of Geiger and Keller (2018), the personality of the brand could be reflected through the showroom, which would provide an engaging experience to encourage shoppers to purchase products online after they have touched and tried them in the shop. It could be said that a sustainable shopping experience revolves around turning the shopping centres of the future into engagement centres. Retailers such as Zara would need to focus on how to take the shopper on an improved and sustainable customer journey.

Relationship Between Sustainable Shopping Experience and Customer Journey

Signori et al. (2019) highlighted that sustainability, in both its environmental and social aspects, is one of the defining trends of retail evolution. It is becoming paramount as customers make a long-term shift towards an eco-friendlier environment and adopt similar shopping behaviour. Consumers are already asking brands what they are doing to integrate sustainability into their business operations.

From the research conducted by Lin et al. (2020), it has been identified that the trend of sustainable shopping is very strong among Gen Z and millennial consumers. This is because they belong to a younger shopper segment and tend to identify with sustainable values more than older generations of shoppers.
Witell et al. (2020) explained that sustainable shopping is not just about the brand: product packaging and store design are integral to providing a sustainable shopping experience. Adivar et al. (2019) added that customers are not only asking for environmental sustainability but are also concerned about the impact of the company's operations across the entire supply chain. They want information on everything from ethical component sourcing to water consumption and pollution management.

However, Holmlund et al. (2020) argued that shoppers are more concerned about product packaging and have been expecting brands and retailers to invest more in sustainable alternatives. This is an important aspect of the customer journey because the packaging communicates the brand's tone when the customer opens the product.

In the views of Cervellon and Vigreux (2018), if the brand does not have recyclable packaging, it is highly unlikely that the customer will make another purchase. This is because customers feel that once they open the product the packaging goes to waste, and if it is non-recyclable it simply contributes to the pile of waste.

Literature Gap

In the current literature, a gap has been identified concerning the impact of the sustainable shopping experience on the customer journey and customers' viewpoints. This is an important element because, even though the two components are related, they exist independently in a retail environment. Brands such as Zara are making a conscious effort to provide a sustainable shopping experience but are still looking for answers on how it improves the customer journey, makes customers want to spend more time in the store and inclines their purchase decisions in favour of the organisation. The impact of the sustainable shopping experience on the customer journey needs to be explored to gain clarity on which particular aspects could be integrated into the business for improving the customer journey while operating in a sustainable manner.


Within the current report, data has been collected from secondary sources, and the information collected is qualitative. This is useful in the present context because the researcher aims to explore business sustainability in terms of Zara by reflecting upon its case study. To add credibility and reliability to the study, data from secondary sources as well as from a real-life organisation has been integrated. The database searched for collecting secondary data is Google Scholar, because it makes it easy for the individual to search for published books and journals using keywords (Johnson and Sylvia, 2018).

The researcher has made sure that only books and journals published in 2017 or after have been integrated. This data is comparatively new, being at most four years old, which allows the learner to reflect upon the latest perspectives on the sustainable shopping experience and customer journey. By doing so, the individual is able to curate an informed set of findings.

Case Study Findings and Analysis

Overview of the Organization

Zara is a Spanish apparel retailer. It was founded in 1974 and is currently headquartered in the municipality of Arteixo, Spain (Zara, 2021). Zara is one of the biggest international fashion companies, in which the customer is the core element of the business model. It has an extensive retail network which includes production, distribution, design and sales. Working closely and together as a single global company with its parent organisation, Inditex, Zara has been able to focus on the key elements of production. The business operations of the organisation are based on three major pillars: digital integration, sustainability and flexibility. It has been able to bring its customers closer than ever to the products and provide them at affordable prices.

The success of Zara was followed by international expansion at the end of the 1980s and the successive launch of new brands within the same parent organisation, which now have an integrated model of physical stores as well as online business operations (Inditex, 2021). The human resource at Zara is focused on fulfilling the demands of the customers, because the company is focused on creating value beyond profit by placing the concerns of the environment and people at the core of its decision-making. Zara is focused on doing better and being better in its business operations while securing sustainability.

Critical Evaluation of the Issue – Sustainable Shopping Experience at Zara

Zara Home is focused on unveiling its new global image. Its new store, the Plaza de Lugo, will reopen in March with a totally overhauled concept. The store has been reported to have new designs and materials which will establish a global benchmark for the brand, as the new concept revolves around being 100% ecological (Inditex, 2021). The store will feature minimalistic designs with traditional roots along with the latest technologies, which will contribute to the shopping experience of the buyer.

The construction materials of the store were sourced with the help of local artisans and include lime and marble along with linen and silk, in contrast with the furniture, which is made from traditional materials such as oak, slate and granite. It has been identified that this environmentally friendly store has used materials capable of absorbing carbon dioxide. It displays only traditional handcrafted pieces made on handlooms, lit by novel, warm and comfortable lighting. The store's energy consumption is enabled through sustainable technology and, with monitored use of electricity, is focused on ensuring that it does not harm the environment in any manner. The idea of this store is to provide a new shopping experience to the customers.

Within this, the products displayed tend to stand out in a space which feels familiar, like home, and is thus aligned with the brand image of Zara Home. This has been achieved by recreating a mixture of aesthetic beauty and feelings of well-being and comfort. The results of the sustainable shopping experience curated by Zara are on display at its flagship store, which reopened in March 2021 after a full renovation and overhaul (Inditex, 2021). It could be stated that the new Zara Home store concept enables customers to uniquely experience the products and discover its collections in a better way. The idea behind the design was to create an enjoyable visit for customers to a warm space which focuses on sustainability and comfort by integrating beauty and technology together.


By analysing the contents of the report, the following recommendations are made for Zara to improve its sustainable shopping experience and ultimately enhance the customer journey:

Using recycled and biodegradable packaging: it is suggested that Zara make efforts to reduce the amount of plastic packaging used in its products. Biodegradable packaging made from plant-based materials such as cellulose and starch, which break down naturally, could be used to make bags. It is necessary to reassess how the organisation uses its packaging and where it can reduce the negative impact on the environment.

Minimising use of paper: it is necessary to reduce the amount of paper which is used in the organisation in order to drive sustainability. Zara needs to identify tasks and processes that require pen and paper to perform and then digitise them. For example, providing the bill and invoice to the customers requires the use of paper and ink which could be digitised and sent directly to the phone number or email address. It will make it easier for both the organisation as well as the customers to access the invoice if it is available in digital form because people are likely to misplace paper slips.

Empowering customers to engage in sustainable operations: it has been identified that when people want to be more sustainable, they are likely to make sustainable purchasing decisions in order to leave their mark. Helping customers offset the impact of their retail habits would be highly beneficial for Zara's own sustainability efforts. Zara would need to make people feel empowered as consumers and motivate them to change their daily habits. It also gives them confidence that Zara is out there to make a difference in the long term.


Findings and Importance to Business

From the current report, it has been found that the sustainable shopping experience is gaining importance in the current environment. This is because customers are inclined to make purchasing decisions in favour of those business entities that integrate sustainable aspects into their operations. It is important to Zara because the brand holds a negative reputation for engaging in fast fashion and not performing sustainable operations. However, by integrating aspects of the sustainable shopping experience, it would be able to improve its business model and brand image in both the short term and the long term, which will further help the organisation increase its sales and stay up to date with current trends in the market.

Risks for Implementation

The major risk in implementing the recommendations is that changes to the business model would need to be made at an international level. For example, in order to introduce biodegradable packaging, changes would be needed in all the stores and warehouses that satisfy both offline and online demand, to make sure the change has been implemented. It is risky because, even though customers would be in favour of biodegradable packaging, it is unclear how it will actually solve the issue.

In addition, minimising the use of paper may make it difficult for customers to adhere to the change. Zara has always provided invoices and bills on paper, and if these turn digital it is unknown how customers will absorb the change and whether they will be in favour of it. One of the major reasons is that customers are sceptical about giving personal details such as a phone number and email address while making a purchase, as it makes them susceptible to phishing scams.

Limitations of the Study

The limitations of the current study are stated below:

- The researcher has only used secondary data. This means that the findings of the study have been developed from the opinions and perspectives of other authors and researchers.

- A limited number of studies are integrated in the report, which reduces the reliability of the conclusion. The major reason is that a small sample size interferes with the level of generalisation, because the researcher generalises specific content on the basis of the opinions and findings of a small number of people.


Read More

MIS603 Microservices Architecture Report 2 Sample

Assessment Task

This proposal should be approximately 1500 words (+/- 10%) excluding cover page, references and appendix. This proposal must be typed and clearly set out (presented professionally). You need to pay special attention to principles and key concepts of Microservices Architecture (MSA), service design and DevOps. The purpose of this assessment is to give you an opportunity to contextualise and convey to the learning facilitator relevant knowledge based on “real-world” application. Particularly, the aim of this assessment is to enhance your employability skills by providing hands-on education and opportunities to practise real-life experience. As a result, this assessment item is developed not only to evaluate your understanding of MSA application, but also to assist you in practising and improving your research skills. In doing so, this assessment will formatively develop the knowledge required for you to complete Assessment 3 successfully.


MSA has been getting more and more popular in recent years, and several organisations are migrating monolithic applications to MSA. An MSA consists of a collection of small, autonomous services; each service is a separate codebase, which can be managed by a small development team. A team can update an existing service without rebuilding and redeploying the entire application.

Services are responsible for persisting their own data or external state. This differs from the traditional model, where a separate data layer handles data persistence. More recently, with the development of cloud computing, new ways of software development have evolved with MSA recognised as a cloud-native software development approach. As a professional, your role will require that you understand the principles of software development, especially in the field of cloud-based platforms, which are rapidly becoming the preferred hosting solution for many organisations. Having a working understanding of these concepts will enable you to fulfil many roles and functions, and be informed as to what factors influence decision making when software development architecture has been selected. Whilst you may not be a developer, it will enable you to have meaningful conversations about the principles of MSA and why certain decisions may be made in a certain way. This will support you to manage the bridge between IT and the business.


You are expected to address the following steps to fulfil this assessment task:

1. From the list below, select an organisation that you are familiar with and/or have enough data and information about. Here is a list of organisations using microservices:

o Comcast Cable
o Uber
o Hailo
o Netflix
o Zalando
o Amazon
o Twitter
o PayPal
o Ebay
o Sound Cloud
o Groupon
o Gilt

2. Discuss how MSA has transformed or revolutionised the operations of the organisation. Identify and analyse at least three business or organisational reasons for justifying your discussion.

3. Develop a business proposal to introduce the selected organisation, justify why you chose it and why microservices is the best architecture for the selected organisation.

The report should consist of the following structure:

A title page with subject code and name, assignment title, student’s name, student number, and lecturer’s name.

The introduction (200–250 words) that will also serve as your statement of purpose for the proposal— this means that you will tell the reader what you are going to cover in your proposal. You will need to inform the reader of:

a) Your area of research and its context

b) The key elements you will be addressing

c) What the reader can expect to find in the body of the report.

The body of the report (1000–1100 words): you are required to research and write a proposal focused on an organisation using MSA as a software development philosophy. You are strongly advised to do some research regarding MSA in a “real-world” application.

The conclusion (200–250 words) will summarise any findings or recommendations that the report puts forward regarding the concepts covered in the report.

Format of the report

The report should use font Arial or Calibri 11 point, be line spaced at 1.5 for ease of reading, and have page numbers on the bottom of each page. If diagrams or tables are used, due attention should be given to pagination to avoid loss of meaning and continuity by unnecessarily splitting information over two pages. Diagrams must carry the appropriate captioning.



Microservices architecture has gained popularity in the past few years, as displayed by the migration of monolithic applications towards its adoption. It encompasses a collection of small, autonomous services wherein each service consists of a separate codebase that can be managed by a compact development team. A unique benefit of this approach is that the development team can update an existing service without rebuilding or redeploying the entire application (Li et al., 2020). With the recent advancements in cloud computing and new and innovative manners of software development, microservices architecture has acquired recognition as a cloud-native method for software development (Aderaldo et al., 2017).

The report discusses the manner in which microservices architecture has led to a substantial transformation and revolutionisation of the operations of the selected company, Uber. In the same context, it identifies and analyses the organisational reasons justifying the shift of Uber from a monolithic architecture to a microservice system by assessing the issues that led Uber to make the decision. As such, the report intends to develop a business proposal that would be introduced to the selected organisation, in line with the recognition of microservices as the best architecture, by laying emphasis on the benefits that it brings to Uber.


Adoption of microservices architecture is becoming a popular solution to large and complex issues in IT systems, owing to the fact that it entails a technique for developing applications in the form of small services wherein each service is tasked with one function, such as product search, shipment or payment. Such services communicate with one another through API gateways. A number of international companies, such as Amazon and Netflix, have displayed a transition from monolith to microservices, thus clarifying its relevance and usability. Similar to many start-ups, Uber initiated its operations with a monolithic architecture in order to cater to a single offering in one particular location. The choice had been justified at that time, with all the operations transpiring under the UberBLACK option based in San Francisco (Haddad, 2019). The presence of a single codebase had proven sufficient for the resolution of the business concerns of the company, which included connecting drivers with riders, billing and payments. As such, it was reasonable to confine the business logic of the company in one particular location. Nevertheless, with the rapid expansion of Uber into a greater number of cities, accompanied by the introduction of new products, such a nature of operations needed to undergo considerable variation or a complete change.
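The service-per-function idea described above can be sketched in a few lines of Python. This is purely illustrative, not Uber's actual code: each "service" owns one function and its own logic, and a gateway object routes requests to it, so one service can be replaced or redeployed without touching the others.

```python
def search_service(request):
    # Owns only search logic; other services never see its internals.
    return {"results": [f"match for {request['query']}"]}

def payment_service(request):
    # Owns only payment logic; here, a flat base fare plus a per-km rate.
    return {"charged": round(request["base"] + 1.5 * request["km"], 2)}

class ApiGateway:
    """Single entry point that dispatches requests to independent services."""

    def __init__(self):
        self.routes = {}

    def register(self, path, service):
        self.routes[path] = service

    def handle(self, path, request):
        if path not in self.routes:
            return {"error": 404}
        return self.routes[path](request)

gateway = ApiGateway()
gateway.register("/search", search_service)
gateway.register("/pay", payment_service)

print(gateway.handle("/pay", {"base": 2.0, "km": 4.0}))  # {'charged': 8.0}
```

In a real deployment each service would run as its own process behind the gateway; the point of the sketch is only that the gateway isolates callers from which service implements which function.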

Consequently, the growth of the core domain models and the introduction of new features led to an outcome wherein the components of the company became tightly coupled, and enforcing encapsulation between them grew difficult, which increased the difficulty of separation. Furthermore, continuous integration proved to be a liability owing to the fact that deployment of the codebase corresponded with deployment of all the services at once. This meant a greater amount of pressure for the engineering team, who had to handle more requests as well as a significant increase in developer activity, irrespective of the rapid growth and scaling of the said engineering team (Haddad, 2019). Moreover, the continual need for the addition of new features, resolution of bugs and fixing of technical debt within a single repo developed into significant challenges. In the same context, the expansion of the system, on the basis of a monolithic architecture, resulted in issues for the company in line with scalability as well as persisting integration (Torvekar & Game, 2019).

Additionally, a single change in the system became a huge responsibility for the developers in view of the dependencies between the components of the Uber app. The monolithic structure of Uber entailed a connection between passengers and drivers with the aid of a REST API (Hillpot, 2021). It also possessed three adapters with embedded APIs to serve the purposes of billing, payments and text messages. Furthermore, the system involved a MySQL database, with all the features being contained in the monolith (Hillpot, 2021). In other words, the primary reasons for Uber's decision to transition to microservices architecture corresponded with the following factors (Gluck, 2020):

- The deployments were expensive and time consuming, in addition to necessitating frequent rollbacks.

- The maintenance of good separations of concerns in relation to the huge codebase proved challenging since expediency, in an exponential growth environment, results in poor boundaries between components and logic.

- The combined issues provided challenges for the execution of the system in an autonomous or independent manner.

As such, Uber followed the path of other successful hypergrowth companies such as Amazon, Netflix and Twitter, with the aim of breaking the monolith into multiple codebases such that a service-oriented architecture could be developed. To be specific, the company opted for the adoption of microservices architecture. The migration from the monolithic codebase to microservices architecture enabled the company to resolve a number of concerns. Accordingly, the introduction of the new architecture was accompanied by the introduction of an API gateway as well as independent services, each of which possessed individual functions and could be deployed and scaled in a distinct manner (Kwiecien, 2019). The adoption of the microservices architecture increased overall system reliability and facilitated the separation of concerns by establishing more distinctly defined roles for each of the components. It also highlights ownership of the code and enables autonomous execution of the services (Gluck, 2020).

The implementation of microservices architecture also allows for developer velocity, wherein teams retain the capability of deploying code independently at their own pace (Gluck, 2020), thus improving productivity. Because software developed as microservices is fragmented into smaller, independent services, each of which can be written in its own language, the approach aided Uber in the development process, in formulating continual delivery and in accelerating the growth of the business (Rud, 2020). Consequently, the transition of Uber from a monolithic system to microservices architecture augmented the speed, quality and manageability of software development, and its reliability with regard to fault tolerance, while allowing teams to focus on just the services that required scaling, thus speeding up the process (Sundar, 2020). Finally, a few real-world applications of microservices architecture at Uber corresponded with the processing and maintenance of customer profile data, handling the different types of rides available on the basis of location, mapping the location of the customer and that of nearby rides on a custom map, forming a set of potential rides with respect to a specific request, and computing the price of the ride in a dynamic manner.


Microservices remain responsible for persisting their own data or external state. The primary difference from a traditional model, therefore, is that the traditional model has a separate data layer that manages data persistence. An understanding of these concepts enables the fulfilment of varied roles and functions, in addition to providing knowledge of the factors that influence decision-making when a software development architecture is selected. A number of international companies, such as Amazon, Netflix and Coca-Cola, have opted for a transformation of their IT infrastructure to implement microservices architecture. The process is also accompanied by a rebuilding of the internal organisational structure to obtain a competitive edge.

It is significant to comprehend the principles associated with this mode of software development, particularly with respect to cloud-based platforms, which have found increasing application as preferred hosting solutions for a number of organisations. The transition to microservices architecture has proven to bring significant benefits for Uber with respect to the process of development, scaling and independent deployment of each microservice. In the same context, it also allows the company to cut back on undesired expenses while encouraging innovation as well. It has also been highlighted that microservices architecture relies strongly on people and processes within an organisation, beyond its involvement with technology, in view of the fact that individual microservices are maintained by independent, specialised teams.



BDA601 Big Data and Analytics Report Sample

Task Summary

Customer churn, also known as customer attrition, refers to the movement of customers from one service provider to another. It is well known that attracting new customers costs significantly more than retaining existing customers. Additionally, long-term customers are found to be less costly to serve and less sensitive to competitors’ marketing activities. Thus, predicting customer churn is valuable to telecommunication industries, utility service providers, paid television channels, insurance companies and other business organisations providing subscription-based services. Customer-churn prediction allows for targeted retention planning.

In this Assessment, you will build a machine learning (ML) model to predict customer churn using the principles of ML and big data tools.

As part of this Assessment, you will write a 1,000-word report that will include the following:

a) A predictive model from a given dataset that follows data mining principles and techniques;

b) Explanations as to how to handle missing values in a dataset; and

c) An interpretation of the outcomes of the customer churn analysis.

Please refer to the Task Instructions (below) for details on how to complete this task.

Task Instructions

1. Dataset Construction

The Kaggle telco churn dataset is a sample dataset from IBM containing 21 attributes of approximately 7,043 telecommunication customers. In this Assessment, you are required to work with a modified version of this dataset (available at the URL provided below). Modify the dataset by removing the following attributes: MonthlyCharges, OnlineSecurity, StreamingTV, InternetService and Partner.

As the dataset is in .csv format, any spreadsheet application, such as Microsoft Excel or OpenOffice Calc, can be used to modify it. You will use your resulting dataset, which should comprise 7,043 observations and 16 attributes, to complete the subsequent tasks. The ‘Churn’ attribute (i.e., the last attribute in the dataset) is the target of your churn analysis.

Kaggle.com. (2020). Telco customer churn—IBM sample data sets. Retrieved from https://www.kaggle.com/blastchar/telco-customer-churn [Accessed 05 August 2020].

2. Model Development

From the dataset constructed in the previous step, present appropriate data visualisation and descriptive statistics, then develop a ‘decision-tree’ model to predict customer churn. The model can be developed in Jupyter Notebook using Python and Spark’s Machine Learning Library (Pyspark MLlib). You can use any other platform if you find it more efficient. The notebook should include the following sections:

a) Problem Statement

In this section, briefly state the context and the problem you will solve in the notebook.

b) Exploratory Data Analysis

In this section, perform both a visual and statistical exploratory analysis to gain insights about the dataset.

c) Data Cleaning and Feature Selection

In this section, perform data pre-processing and feature selection for the model, which you will build in the next section.

d) Model Building

In this section, use the pre-processed data and the selected features to build a ‘decision-tree’ model to predict customer churn.

In the notebook, the code should be well documented, the graphs and charts should be neatly labelled, the narrative text should clearly state the objectives and a logical justification for each of the steps should be provided.

3. Handling Missing Values

The given dataset has very few missing values; however, in a real-world scenario, data scientists often need to work with datasets containing many missing values. If an attribute is important to building an effective model but has significant missing values, data scientists need strategies to handle them.

From the ‘decision-tree’ model built in the previous step, identify the most important attribute. If a significant number of values were missing in that attribute's column, implement a method to replace the missing values and describe that method in your report.
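As a hedged illustration of one such strategy (an example, not the brief's prescribed solution), replacing missing values in a categorical column with its mode can be sketched in pandas; the column values below are invented:

```python
import pandas as pd

# Toy categorical column with gaps; values are illustrative only.
col = pd.Series(["Month-to-month", None, "Two year", "Month-to-month", None])

# Replace missing values with the most frequent category (the mode).
filled = col.fillna(col.mode()[0])

print(filled.isnull().sum())  # no missing values remain
```

Mode imputation is a common default for categorical attributes; for numeric attributes the median plays the same role.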

4. Interpretation of Churn Analysis

Modelling churn is difficult because there is inherent uncertainty when measuring churn. Thus, it is important not only to understand any limitations associated with a churn analysis but also to be able to interpret the outcomes of a churn analysis.

In your report, interpret and describe the key findings that you were able to discover as part of your churn analysis. Describe the following facts with supporting details:

• The effectiveness of your churn analysis: What percentage of the time was your analysis able to correctly identify churn? Can this be considered a satisfactory outcome? Explain why or why not;

• Who is churning: Describe the attributes of the customers who are churning and explain what is driving the churn; and

• Improving the accuracy of your churn analysis: Describe the effects that your previous steps, model development and handling of missing values had on the outcome of your churn analysis and how the accuracy of your churn analysis could be improved.



Customers are the key entities of any organisation: they generate its business and profit, so every organisation aims to attract more customers in order to increase profit. If customers are satisfied with the service, they will stay with the organisation; otherwise attrition may occur (Eduonix, 2018). This is called customer churn, which describes whether a customer has been retained or has left the business. In this paper, customer attrition is determined with the application of machine learning.


The data has been collected from Kaggle and concerns customer churn (BlastChar, 2017). It contains records both of customers who have left the company and of those who have stayed with it, taking services and purchasing goods. The data is shown below:

Fig-1: Customer Churn Dataset

Initially, after collecting the data, it has been seen that the data contains 7043 instances or rows and 21 features or columns. The number of rows and columns are shown below:

Fig-2: Initial Data Attributes

Now, five features, namely MonthlyCharges, OnlineSecurity, StreamingTV, InternetService and Partner, have been removed, and the resulting dataset contains the following attributes:

Fig-3: Resulting Data Attributes

So, presently, the data contains 7043 instances and 16 columns.
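The attribute removal above can also be done programmatically. The sketch below uses pandas on a tiny invented frame carrying a subset of the dataset's column names; in practice the full Kaggle CSV would be loaded with pd.read_csv:

```python
import pandas as pd

# Miniature stand-in for the Telco frame (values invented; real code
# would use pd.read_csv("WA_Fn-UseC_-Telco-Customer-Churn.csv")).
df = pd.DataFrame({
    "customerID": ["0001", "0002"],
    "gender": ["Male", "Female"],
    "MonthlyCharges": [29.85, 56.95],
    "OnlineSecurity": ["No", "Yes"],
    "StreamingTV": ["No", "Yes"],
    "InternetService": ["DSL", "Fiber optic"],
    "Partner": ["Yes", "No"],
    "Churn": ["No", "Yes"],
})

# Remove the five attributes named in the task brief.
dropped = ["MonthlyCharges", "OnlineSecurity", "StreamingTV",
           "InternetService", "Partner"]
df = df.drop(columns=dropped)

print(list(df.columns))
```

On the real file this leaves 7,043 rows and 16 columns, matching the figures quoted above.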


The descriptive statistics of the dataset have been checked, with the following outcome (Learning, 2018):

Fig-4: Data Description

After that, the information of the data has been checked and the following outcome has been obtained:

Fig-5: Data Information

From this information, it can be seen that all features are now of object (categorical) type.



In this paper, the problem statements have been prepared as follows:

1. What are the factors that are influencing customer churn?

2. How is a Decision Tree Classifier helpful in determining the attrition of customers?


The data analysis has been performed on several features. First, customer attrition has been visualised by gender (Sosnovshchenko, 2018). It can be seen that male customers show a higher tendency to churn than female customers.

Fig-6: Analysis of Gender

Next, the analysis has been done to visualise whether online backup is related to customer attrition. The outcome of the analysis is shown below:

Fig-7: Analysis of Online Backup

The analysis has then been performed on paperless billing for the purchased products. It can be seen that customers who did not receive paperless billing were more likely to churn.

Fig-8: Analysis of Paperless Billing

The analysis has also been performed on the payment method for the purchased products. It can be seen that customers who used the electronic check payment method were more likely to churn.

Fig-9: Analysis of Payment Method


2.3.1 Data preprocessing and Cleaning

As seen earlier, the features of the data are categorical and cannot be fed directly into machine-learning algorithms (Learning, 2018). Therefore, all features have been preprocessed and converted to numerical data using encoding, as follows:

Fig-10: Data Preprocessing and Encoding

After preprocessing, the data has been checked for missing values, and none were found:

Fig-11: Detecting Missing Values

So, there is no requirement for data cleaning as the data is already cleaned.
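A minimal sketch of the encoding and missing-value check, using pandas category codes as one stand-in for whatever encoder the notebook actually used (the toy values are invented):

```python
import pandas as pd

# Toy frame standing in for the categorical Telco columns (names assumed).
df = pd.DataFrame({
    "gender": ["Male", "Female", "Female"],
    "Contract": ["Month-to-month", "Two year", "One year"],
    "Churn": ["Yes", "No", "No"],
})

# Encode every categorical column to integer codes, as Fig-10 does;
# category codes are a simple alternative to sklearn's LabelEncoder.
encoded = df.apply(lambda col: col.astype("category").cat.codes)

# Confirm there are no missing values, mirroring Fig-11.
print(encoded.dtypes.tolist())
print(df.isnull().sum().sum())
```

Every column comes back as an integer type, so the frame can be passed straight to a classifier.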

2.3.2 Feature Selection

Now, correlation analysis has been applied to check each feature's relationship with Churn. The outcome is shown below in the form of a heatmap:

Fig-12: Correlation of Features
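The correlation step can be sketched numerically; a call such as seaborn.heatmap(encoded.corr()) would render the same matrix graphically as in the figure. The toy encoded values below are invented:

```python
import pandas as pd

# Small already-encoded frame (values invented); the real notebook
# would run this on the full encoded dataset.
encoded = pd.DataFrame({
    "Contract": [0, 2, 1, 0, 0],
    "PaperlessBilling": [1, 0, 1, 1, 0],
    "Churn": [1, 0, 0, 1, 0],
})

# Pearson correlation of every feature with the target, ranked by
# absolute strength; the strongest ones become the selected features.
corr_with_churn = (encoded.corr()["Churn"]
                   .drop("Churn")
                   .abs()
                   .sort_values(ascending=False))

print(corr_with_churn)
```

Ranking by absolute correlation is what turns the heatmap into the feature list of Fig-13.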

From the outcome of the correlation, the highly correlated features have been selected and shown below:

Fig-13: Finally Selected Features

So, these features will now be used as the final predictor features for the Decision Tree Classifier by retaining Churn as the target feature (Sosnovshchenko, 2018).


The predictor features have been selected from the correlation and the final dataset is shown below:

This data has been split into train and test sets as follows:

Fig-14: Data Splitting

The data splitting has been done using a 75-25 ratio: the training set contains 5,283 observations (used to train the decision tree classifier) and the test set contains 1,760 instances (used to test it) (Eduonix, 2018). In the test set, 1,297 instances belong to “Not Churn” and 463 belong to “Churn”. The decision tree classifier has then been configured with the following hyperparameters and trained on the training data:

• criterion='entropy'
• splitter='random'
• max_features='auto'
• random_state=10
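A hedged sketch of the split and training steps with the listed hyperparameters, using scikit-learn on synthetic data (the notebook's real data and exact pipeline may differ; note that max_features='auto' was removed in recent scikit-learn versions, for which 'sqrt' is the classifier equivalent):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the encoded churn data: 9 integer-coded
# predictor features and a binary Churn target (values invented).
rng = np.random.default_rng(10)
X = rng.integers(0, 3, size=(200, 9))
y = rng.integers(0, 2, size=200)

# 75-25 split, as in the report.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=10)

# Hyperparameters from the report; 'sqrt' replaces the removed 'auto'.
clf = DecisionTreeClassifier(criterion="entropy", splitter="random",
                             max_features="sqrt", random_state=10)
clf.fit(X_train, y_train)

print(round(clf.score(X_test, y_test), 2))
```

Because the synthetic labels are random, the accuracy here is meaningless; on the real encoded dataset the same calls produce the trained model evaluated below.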

After training the decision tree classifier, the model has been tested and the confusion matrix has been obtained as follows:

Fig-15: Confusion matrix

In this confusion matrix, it can be seen that 1,110 out of 1,297 instances have been correctly classified as “Not Churn” and 302 out of 463 instances have been correctly classified as “Churn”. Overall, 1,412 instances have been correctly classified, attaining 80.23% accuracy, 81% precision, 80% recall and an 80% F1-score. The performance overview is shown below in the form of a classification report (Lakshmanan, Robinson, & Munn, 2021).

Fig-16: Classification Report
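The evaluation step can be reproduced in miniature with scikit-learn's metrics; the toy labels below are invented, but the same calls applied to the report's real matrix (1,110 + 302 correct out of 1,760) yield the quoted 80.23% accuracy:

```python
import numpy as np
from sklearn.metrics import (confusion_matrix, accuracy_score,
                             classification_report)

# Hypothetical true labels and predictions standing in for the
# trained model's output (0 = Not Churn, 1 = Churn).
y_true = np.array([0] * 8 + [1] * 4)
y_pred = np.array([0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 0])

cm = confusion_matrix(y_true, y_pred)   # rows = actual, cols = predicted
acc = accuracy_score(y_true, y_pred)

print(cm)
print(acc)
print(classification_report(y_true, y_pred))
```

The diagonal of the matrix counts correct classifications per class, and accuracy is the diagonal sum divided by the total, exactly the arithmetic behind the 80.23% figure.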


The data regarding customer churn has been obtained from Kaggle and analysed to detect customer attrition. The data has been preprocessed and features have been selected; the prepared data has then been split into train and test sets, and the decision tree classifier has been trained, tested and evaluated. The problem statements have been addressed as follows:

1. Features such as SeniorCitizen, Dependents, OnlineBackup, DeviceProtection, TechSupport, StreamingMovies, Contract, PaperlessBilling and PaymentMethod have been found to be important for predicting customer churn.

2. A Decision Tree Classifier can be used to classify and predict customer churn with 80.23% accuracy, 81% precision, 80% recall and an 80% F1-score.




TITP105 The IT Professional Report Sample

COURSE: Bachelor of Information Technology

Assessment Task:

Students are required to analyse the weekly lecture material of weeks 1 to 11 and create concise content analysis summaries of the theoretical concepts contained in the course lecture slides.


Students are required to analyse the weekly lecture material of weeks 1 to 11 and create concise content analysis summaries (reflective journal report) of the theoretical concepts contained in the course lecture slides. Where the lab content or information contained in technical articles from the Internet or books helps to fully describe the lecture slide content, discussion of such theoretical articles or discussion of the lab material should be included in the content analysis.

The document structure is as follows (3500 Words):

1. Title Page

2. Introduction (100 words)

3. Background (100 words)

4. Content analysis (reflective journals) for each week from 1 to 11 (3200 words; approx. 300 words per week):

a. Theoretical Discussion

i. Important topics covered

ii. Definitions

b. Interpretations of the contents

i. What are the most important/useful/relevant information about the content?

c. Outcome

i. What have I learned from this?

5. Conclusion (100 words)

Your report must include:

• At least five references, out of which, three references must be from academic resources.
• Harvard Australian referencing for any sources you use.
• Refer to the Academic Learning Skills student guide on Referencing.


1. Introduction

The main aim of this reflective journal report is to analyse the lectures of weeks 1 to 11 regarding ethics in information technology. The journal describes the various roles of IT professionals and the social, personal, legal and ethical impacts arising from their work. The role of the professional associations available to IT professionals is also described. The report assesses the relationship between IT professionals and the issues of governance, ethics and corporate citizenship. I will critically analyse and review IT professional Codes of Conduct and Codes of Ethics in this reflective journal report, which will help to develop a personal ethical framework.

2. Background

Technology offers various opportunities and benefits to people worldwide; however, it also carries the risk of eroding one's privacy. Information technology is essential for conducting business and transferring information in today's era. With its development, ethics in information technology has become important, since information technology can harm one's intellectual property rights. Ethics among IT professionals can be defined as the attitudes and behaviour they bring to completing their work. IT professionals need high ethical standards as they control, manage, analyse, maintain, design, store and implement data. They face several challenges in their profession, and it is their role and responsibility to resolve these issues; the ethics of the profession guide them in doing so.

3. Content analysis

Week 1

a. Theoretical discussion

i. Important topics covered

In week 1, an overview of ethics was discussed. Ethical behaviour follows generally accepted norms, which evolve according to the changing needs of a society or social group sharing similar values, traditions and laws. Morals are the personal principles that guide an individual in making decisions about right and wrong (Reynolds, 2018). The law, on the other hand, is a system of rules that guides and controls what individuals may do.

ii. Definitions

Corporate Social Responsibility: Corporate social responsibility adheres to organisational ethics. It is a concept of management that aims to integrate social and environmental concerns for promoting well-being through business operations (Carroll and Brown, 2018, p. 39). Organisational ethics and employee morale lead to greater productivity for managing corporate social responsibility.

b. Interpretation

The complex work environment of today's era makes it difficult to implement codes of ethics and related principles in the workplace; this is where the idea of Corporate Social Responsibility comes in. CSR is the continuing commitment by a business to contribute to economic development and to behave ethically in ways that can improve the quality of life of its employees and the local community (Kumar, 2017, p. 5). CSR and good business ethics should create an organisation that operates consistently and fosters well-structured business practices.

c. Outcome

From these lectures in the 1st week, I have learned the basic concepts of ethics and their role and importance in business and organisation. There are several ways to improve business ethics in an organisation by establishing a Corporate code of ethics, establishing a board of directors to set high ethical standards, conducting social audits and including ethical quality criteria in their organisation's employee appraisal. I have also learned the five-step model of ethical decision making by defining the problem, identifying alternatives, choosing an alternative, implementing the final decisions and monitoring the outcomes.

Week 2

a. Theoretical discussion

i. Important topics covered

In the 2nd week, ethics for IT professionals and IT users were discussed. IT workers are involved in several work relationships with employers, clients, suppliers and other professionals. The key issues in the relationship between IT workers and employers are setting and implementing policies on the ethical use of IT, whistleblowing and safeguarding trade secrets. BSA | The Software Alliance and the Software and Information Industry Association (SIIA) are trade groups representing the world's largest hardware and software manufacturers; their main aim is to prevent unauthorised copying of software produced by their members.

ii. Definition

Whistle-blowing refers to the disclosure, by a member or former member of an organisation, of information about practices that cause harm to the public interest (Reynolds, 2018). For example, it occurs when an employee reveals that their company is engaging in inappropriate activities (Whistleblowing: balancing on a tight rope, 2021).

b. Interpretation

The key issues in the relationship between IT workers and clients are preventing fraud and misrepresentation and managing conflicts between the client's interests and the IT worker's interests. The key issues between IT workers and suppliers are bribery, separation of duties and internal control. IT professionals need to monitor inexperienced colleagues, prevent inappropriate information sharing and demonstrate professional loyalty in the workplace. IT workers also need to safeguard against software piracy, inappropriate information sharing and inappropriate use of IT resources, so as to protect IT users' privacy and intellectual property rights, and to practise their profession ethically so that their activities benefit rather than harm society.

c. Outcome

I have learnt the various work relationships that IT workers share with suppliers, clients, IT users, employers and other IT professionals.

Week 3

a. Theoretical discussion

i. Important topics covered

In week 3, ethics for IT professionals and IT users was discussed further, along with solutions to several issues that IT professionals face. To address these issues effectively, IT professionals need certain characteristics: the ability to produce high-quality results, effective communication skills, adherence to high moral and ethical standards, and expertise in relevant skills and tools.

ii. Definition

A professional code of ethics is the set of principles that guide the behaviour of employees in a business (Professional code of ethics [Ready to use Example] | Workable, 2021). It supports ethical decision-making through high standards of ethical behaviour, provides a benchmark for self-assessment, and builds trust and respect with the general public.

b. Interpretation

Licensing and certification increase the effectiveness and reliability of information systems. IT professionals face several ethical issues in their jobs like inappropriate sharing of information, software piracy and inappropriate use of computing resources.

c. Outcome

I have learned several ways that organisations use to encourage the professionalism of IT workers. A professional code of ethics is used for the improvement of the professionalism of IT workers. I have learnt several ways to improve their ethical behaviour by maintaining a firewall, establishing guidelines for using technology, structuring information systems to protect data and defining an AUP.

Week 4

a. Theoretical discussion

i. Important topics covered

In week 4, the discussion was focused on the intellectual property and the measurements of the organisations to take care of their intellectual properties. Intellectual property is the creations of the mind, like artistic and literary work, inventions, symbols and designs used in an organisation. There are several ways to safeguard an organisation's intellectual property by using patents, copyright, trademark and trade secret law.

ii. Definition

A patent grants the owner of an invention the exclusive right to decide how the invention may be used (Reynolds, 2018). Under the Digital Millennium Copyright Act, circumventing the technological protection of copyrighted works is illegal; the Act also limits the liability of ISPs for copyright violations by their customers. Trademarks are signs that distinguish the goods and services of one organisation from those of others. Several acts protect trade secrets, such as the Economic Espionage Act and the Uniform Trade Secrets Act.

b. Interpretation

Open-source code is any program whose source code is made available for use or modification. Competitive intelligence refers to a systematic process by which an organisation gathers and analyses information about its competitors and the economic and socio-political environment (Shujahat et al. 2017, p. 4). Competitive intelligence analysts must avoid unethical behaviour such as misrepresentation, lying, bribery or theft. Cybersquatters register domain names matching famous company names or trademarks with which they have no connection, which is illegal.

c. Outcome

I have learnt about several current issues related to the protection of intellectual property, such as reverse engineering, competitive intelligence, cybersquatting and open-source code. For example, reverse engineering breaks something down in order to copy it, understand it or improve it. Plagiarism refers to stealing someone's ideas or words without giving them credit.

Week 5

a. Theoretical Discussion

i. Important topics covered

The ethics of IT organisations include legal and ethical issues associated with contingent workers. An overview of whistleblowing and its associated ethical issues was also addressed (Reynolds, 2018). Green computing is the environmentally responsible and eco-friendly use of resources and technology (Reynolds, 2018); this topic covered the definition of green computing and the initiatives organisations are taking to adopt it.

ii. Definition

Offshore Outsourcing: This is a form of outsourcing in which the work is performed by employees operating in a foreign country (Reynolds, 2018), sometimes on a different continent. In information technology, offshore outsourcing is common and effective. It generally takes place when a company shifts part or all of a business operation to another country to lower costs and improve profit.

b. Interpretation

The most relevant information about the context is whistleblowing and green computing. Whistleblowing is the method of drawing public attention to understand unethical activity and misconduct behaviour within private, public, and third sector organisations (HRZone. 2021).

c. Outcome

After reading the book, I have learned that green computing and whistleblowing are vital factors in an organisation's work. I have also learned about the diverse workforce in tech firms, the factors behind the trend towards independent contractors, and the need for and effect of H-1B workers in organisations. Furthermore, the legal and ethical issues associated with green computing and whistleblowing have also been examined.

Week 6

a. Theoretical discussion

i. Important topics covered

This chapter covered the importance of software quality and key strategies for developing a quality system. Software quality is defined as the set of desirable characteristics of software products, and rests on two essential approaches: quality attributes and defect management. Poor-quality software can cause serious problems for an organisation (Reynolds, 2018). Development models, including the waterfall and agile methodologies, were also covered, along with Capability Maturity Model Integration, a framework for improving processes.

ii. Definition

System-human interface: The system-human interface improves the user experience by providing well-designed interfaces within the system (Reynolds, 2018). It facilitates better interaction between users and machines and is among the critical areas of system safety, since system performance depends largely upon the quality of this interaction.

b. Interpretation

The useful information about the context is the software quality and the important strategies to improve the quality of software. The Capability Maturity Model Integration is the next generation of CMM, and it is the more involved model incorporating the individual disciplines of CMM like system engineering CMM and people CMM (GeeksforGeeks. 2021).

c. Outcome

After reading this content, I have concluded that software quality is one of the essential elements in the development of a business. Quality software improves predictability and productivity, decreases rework, and helps products and services be delivered on time. The week also covered the theories and facts behind the strategies used to develop software quality in an organisation.

Week 7

a. Theoretical discussion

i. Important topics covered

This week discussed privacy, one of the most important concerns for the growth and development of individuals and organisations, along with the rights, laws and strategies adopted to mitigate related ethical issues (Reynolds, 2018). E-discovery can be defined as the electronic aspect of identifying, collecting and producing electronically stored information for investigations and lawsuits.

ii. Definition

Right of Privacy: The privacy of information and the confidentiality of vital information come under the right of privacy (Reynolds, 2018). In information technology, the right of privacy supports access control and proper security for user and system information. It also concerns the right not to have an individual's personal information disclosed to the public.

b. Interpretation

The most relevant information in this context concerns the privacy laws that protect the rights of individuals and organisations. These include the European Union Data Protection Directive, the Organisation for Economic Co-operation and Development (OECD) privacy guidelines and the General Data Protection Regulation, which protect the data and information of individuals and companies (Reynolds, 2018). Key anonymity issues also exist in the workplace, such as cyberloafing, in which employees use workplace internet access for personal purposes instead of doing their work.

c. Outcome

I have learned from this content that every organisation requires privacy protections to safeguard the personal information and credentials it holds, and I have also gained information about the ways and technological developments used to secure such data.

Week 8

a. Theoretical discussion

i. Important topics covered

This week discussed freedom of expression, meaning the right to hold opinions and share information without interference. Some of the key issues around freedom of expression include controlling access to information on the internet, censorship of certain videos, hate speech, anonymity on the internet, pornography and the eradication of fake news (Reynolds, 2018).

ii. Definition

Freedom of Expression: Freedom of expression denotes the ability of an individual or group to express their thoughts, beliefs, ideas and emotions (Scanlon, 2018, p. 24). It encompasses the right to express and impart information, regardless of borders, through any medium of communication, including oral, written and artistic forms.

b. Interpretation

The most important information in this context concerns John Doe lawsuits, which are filed against unnamed defendants in order to identify anonymous persons engaging in malicious behaviour such as online harassment and extortion. Fake news, meanwhile, is false or misleading information; although several networking websites remove it, fake news sites and social media posts still spread videos and images that cause confusion and misinterpretation of a particular subject (Reynolds, 2018).

c. Outcome

After reading the book, I have concluded that the internet is a wide platform on which several malicious practices are carried out, such as fake news and hate speech. I have also gained information about several laws and regulations that protect rights on the internet, including the Telecommunications Act of 1996 and the Communications Decency Act.

Week 9

a. Theoretical discussion

i. Important topics covered

This week discussed cyberattacks and cybersecurity. A cyberattack is an assault launched from one or more computers, often through several network hops, by an unidentified individual (Reynolds, 2018); it can steal personal information and can disable computers. Cybersecurity, on the other hand, is the practice of protecting information from cyberattacks, and there are several methods to protect systems from malware, viruses and other threats.

ii. Definition

Cyber espionage: This is the process of using computer networks to gain illicit access to confidential information (Reynolds, 2018). This malicious practice increases the risk of data breaches: it steals sensitive data or intellectual property, typically held by a government entity or an organisation (Herrmann, 2019, p. 94). Cyber espionage is a particular threat to IT companies, as it targets digital networks for information theft.

b. Interpretation

The most important aspects in this context are intrusion detection systems, proxy servers and virtual private networks. An intrusion detection system is software that alerts administrators when it detects suspicious network traffic. A proxy server acts as an intermediary between a web browser and other web servers on the internet. A virtual private network enables users to access an organisation's server and share data securely by encrypting data transmitted over the internet (Reynolds, 2018).

c. Outcome

After reading the entire content, I have gained information about several types of cyberattack and cybersecurity measures. Cyber attackers include crackers, black-hat hackers, malicious insiders, cyberterrorists and industrial spies (Reynolds, 2018). Cybersecurity concepts include the CIA security triad of confidentiality, integrity and availability. The Department of Homeland Security is an agency working for a safer and more secure America against cyber threats and cyberterrorism, and Transport Layer Security is a protocol that secures internet communication between applications and their users (Reynolds, 2018).

Week 10

a. Theoretical discussion

i. Important topics covered

In this context, social media and the essential elements associated with it are discussed. Social media can be defined as modern technology that enhances the sharing of thoughts, ideas, and information by establishing various networks and communities (Reynolds, 2018). Several companies adopt social media marketing to sell their services and products on the internet by creating websites across the Internet.

ii. Definition

Earned Media: This is observed in brand promotions where media awareness is earned through promotion rather than purchased (Reynolds, 2018). It is also considered organic media, which may include television interviews, online articles, and consumer-generated videos. It is not paid media; rather, it is voluntarily awarded to an organisation. Earned media value is calculated through website referrals, message resonance, mentions, and article quality scores.

b. Interpretation

The most important aspect is social media marketing, where the internet is used to promote products and services. As per the sources, global social media marketing spend nearly doubled from 2014 to 2016, increasing from $15 billion to $30 billion. Organic media marketing and viral marketing are two important aspects of social media marketing.

c. Outcome

I have gained much information about social media and the elements of social media marketing, which enables marketers to sell their products and services to individuals across the internet. Social media is a vast platform that has both advantages and disadvantages. The issues surrounding social media include social networking ethical issues that cause harmful threats and emotional distress to individuals. Several organisations have adopted solutions to these issues, such as measures for fighting cyberstalking and stalking risk profiling.

Week 11

a. Theoretical discussion

i. Important topics covered

This context discusses the impact of information technology on society. Information technology affects the gross domestic product and the standard of living of people residing in developed countries. It has made the education system more productive and effective; e-learning has allowed students to study from their homes. The health care system has also been affected by information technology.

ii. Definition

Robotics: It is the design and construction of machines (robots) for performing tasks done by human beings (Malik and Bilberg, 2018, p. 282). It promotes autonomous machine operating systems for easing the burden and complexity of human labour. In this case, artificial intelligence helps to improve the development process of machines by incorporating the machine learning process. Automobile manufacturing industries use robotics design for safeguarding humans from environmental hazards.

b. Interpretation

The most important aspect of the topic is that artificial intelligence and machine learning have driven the growth of IT. Artificial intelligence replicates human intelligence processes, including activities such as learning, reasoning, and self-correction. Machine learning is the process by which systems improve their performance by learning from data.

c. Outcome

I have gained much information about information technology and its impact on organisations and people. Vast innovation and development have occurred due to the effect of information technology.

4. Conclusion

To conclude, this reflective journal describes the key aspects of ethics in information technology by providing an understanding of the ethical, legal and social implications of information technology that IT professionals need to nurture in their professional work. Critical analysis of privacy, freedom of expression, common issues faced by IT professionals, and solutions to these issues is reflected in this journal. The journal also attempts to address ethical issues in the IT workplace, along with the understanding of IT and ethics that professionals need in order to achieve success.


Read More

MITS4003 Database Systems Report 3 Sample


This assessment item relates to the unit learning outcomes as in the unit descriptor. This assessment is designed to improve student knowledge through further research on recent trends and to demonstrate competence in tasks related to modelling, designing, and implementing a DBMS, Data Warehousing, Data Management and Database Security. It also aims to enhance students' experience in researching a topic based on the learning outcomes they acquired during lectures, activities, assignment 1 and assignment 2, and to evaluate their ability to identify the latest research trends and write a report relevant to the Unit of Study subject matter. This assessment covers the following LOs.

1. Synthesize user requirements/inputs and analyse the matching data processing needs, demonstrating adaptability to changing circumstances;

2. Develop an enterprise data model that reflects the organization's fundamental business rules; refine the conceptual data model, including all entities, relationships, attributes, and business rules.

3. Derive a physical design from the logical design taking into account application, hardware, operating system, and data communications networks requirements; further use of data manipulation language to query, update, and manage a database

4. Identify functional dependencies, referential integrity, data integrity and security requirements; Further integrate and merge physical design by applying normalization techniques;

5. Design and build a database system using the knowledge acquired in the unit as well as through further research on recent trends to demonstrate competence in various advanced tasks with regard to modelling, designing, and implementing a DBMS including Data warehousing, Data Management, DB Security.

Note: Group Assignment. Maximum 4 students are allowed in a group.


These instructions apply to both the Report and Presentation assessments. For this component you will be required to select a published research article / academic paper which must cover one or more of the topics including database modelling, designing, and implementing a DBMS, including Data Warehousing, Data Management, DB Security, Data Mining or Data Analysis. The paper you select must be directly relevant to these topics. The paper can be from any academic conference or other relevant journal or online source such as Google Scholar, academic department repositories etc. Each group is encouraged to select a different paper, and it must be approved by your lecturer or tutor before proceeding. If two groups want to present on the same paper, the group that emails the lecturer or tutor first with their choice will be allocated that paper.

Report - 20% (Due week 12)

For this component you will prepare a report or critique on the paper you chose as mentioned above. Your report should be limited to approx. 1500 words (not including references).

Use 1.5 spacing with a 12-point Times New Roman font. Though your paper will largely be based on the chosen article, you can use other sources to support your discussion. Citation of sources is mandatory and must be in the Harvard style.

Your report or critique must include:

Title Page: The title of the assessment, the name of the paper you are reviewing and its authors, and your name and student ID.

Introduction: A statement of the purpose for your report and a brief outline of how you will discuss the selected article (one or two paragraphs). Make sure to identify the article being reviewed.

Body of Report: Describe the intention and content of the article. Discuss the research method (survey, case study, observation, experiment, or other method) and findings. Comment on problems or issues highlighted by the authors. Discuss the conclusions of the article and how they are relevant to what you are studying this semester.

Conclusion: A summary of the points you have made in the body of the paper. The conclusion should not introduce any ‘new’ material that was not discussed in the body of the paper. (One or two paragraphs)

References: A list of sources used in your text. Follow the IEEE style. The footer must include your name, student ID, and page number.



The article “An overview of end-to-end entity resolution for big data” gives a brief description of entity resolution for big data, and it is reviewed and critically analysed here. The paper provides a comprehensive view of the field of entity resolution, with a focus on its application in the context of big data. The research article proposes a framework for entity resolution over big data that entails identifying and collapsing records referring to the same real-world entities. The framework is also designed to address the challenges posed by big data, considering different techniques and evaluating the proposed framework with the help of real-world data sets. The article covers topics such as database modelling, data management and data analysis, and it is highly relevant to these topics through the framework it presents for designing and implementing a system that can handle the challenge of performing entity resolution accurately and efficiently.

The intention of the Article

The article presents a comprehensive view and analysis of end-to-end entity resolution and of the techniques that are specifically organised and implemented for big data scenarios.

- The research also highlights the importance of, and the challenges faced in, using entity resolution to solve big data issues. Accurate data integration and cleansing are applied so that the impact of data characteristics on entity resolution can be assessed [6].

- The article also explores the different techniques and approaches used in resolving entities in big data, implemented with the help of rule-based methods, machine learning algorithms or probabilistic models designed to handle big data.

- Data preprocessing, which is necessary for effective entity resolution in big data, is also covered. This includes normalisation and analysis of the data warehouse to support data modelling for high-quality results.

- The article also addresses the scalability and efficiency of the analysis by exploring parallel and distributed processing techniques. Data partitioning within the entity resolution process plays a major role when it comes to large-scale data sets.

- Evaluations and case-study applications also play a major role, showing how the techniques lead to successful implementations in big data scenarios across various domains such as healthcare and e-commerce.


- The author highlights big data errors concerning government and organisation-specific sources, which increase both internal and external risks.

- Entity resolution mainly aims at real-world entities that are structured and stored in the relational tables of big data scenarios.

- The author illustrates descriptions of movies, directors and locations drawn from different knowledge bases, with the entity descriptions depicted in tabular format.

- The classification of candidate pairs of descriptions is assumed to encompass the blocking and indexing tasks used to match the data.

Figure 1 Movies, Directors, and Locations from DBpedia (blue) and Freebase (red). Note that e1, e2, e3 and e4 match with e7, e5, e6 and e8, respectively.

- The author includes a survey of big data characteristics, covering the algorithms, the implemented tasks and the workflow of the data. These characteristics include volume, variety and velocity [2].

Case Study Based on Data Modelling and Data Analytics

- The big data entity resolution case study considers the continuous concern of improving the scalability of the technique for increasing volumes of entities, using massively parallel implementations with the help of data modelling and analysis.

- Entity descriptions are evaluated under high veracity, which is resolved by matching data analytics values rather than by traditional deduplication techniques. With the help of analysis, the processing of structured data can be refined: the data are pre-processed into the data warehouse and the blocking keys enhanced to handle different types of challenging data.

- The figure below depicts the different types of similarities between entities in benchmark data sets, considering restaurants and other key parameters, with each dot corresponding to a matching pair [4].

- The horizontal axis describes value-based similarity, while the vertical axis shows the maximum similarity based on the entity's neighbours. Value-based similarities are computed on big data entities and are used to improve data quality and data modelling techniques, supporting the integrity of data management.

Figure 2 Value and neighbor similarity distribution of matching entities in 4 established, real-world datasets.
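The value-based similarity discussed above can be illustrated with a minimal sketch (not the paper's implementation): a Jaccard similarity over the token sets of two entity descriptions, a common baseline in entity resolution. The records and attribute names here are hypothetical.

```python
# Illustrative sketch of value-based similarity for entity resolution:
# Jaccard similarity over the token sets of two entity descriptions.

def tokens(record):
    """Lowercase whitespace tokens from all attribute values of a record."""
    return {tok for value in record.values() for tok in value.lower().split()}

def jaccard(a, b):
    """|A ∩ B| / |A ∪ B| for two token sets."""
    union = a | b
    return len(a & b) / len(union) if union else 0.0

# Two hypothetical descriptions of the same restaurant from different sources.
e1 = {"name": "Ritz Cafe", "city": "New York"}
e2 = {"name": "The Ritz Cafe", "city": "new york"}

score = jaccard(tokens(e1), tokens(e2))
print(round(score, 2))  # a high score suggests a matching pair
```

A real pipeline would combine such value similarities with neighbour-based similarities and blocking, as the figure suggests, but the pairwise comparison step reduces to computations of this kind.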


- Data Modelling

The article considers data modelling as an approach to entity resolution in the context of big data. It covers techniques for representing structured data, which help in capturing the attributes and relationships relevant to entity resolution. Schema design and data integration also play a major role in representing and formulating big data [1].

- Data Analysis

This part discusses and observes techniques for feature extraction and statistical methods that help in comparing and matching entities. It also covers algorithms based on machine learning and data mining that deploy entity resolution with clustering and classification models.

- Data Management

This covers the strategies for processing and managing large data sets during the entity resolution process, including handling noise, inconsistency and missing values in big data. It also explores indexing and storage mechanisms that facilitate the retrieval of matching entities. Parallel and distributed processing supports the scalability of entity resolution over big data.

Conflicts or Issues

- Heterogeneous Data Sources

Big data environments necessitate analysing diverse sources, such as databases and sensor networks. Integrating and reconciling entities across these sources is viewed as a conflict, given the difficulties arising from differences in data formats and schemas [5].

- Dimensionality

Entities with many numerical attributes or features produce high-dimensional data that must be handled. To avoid the curse of dimensionality, the most effective methods are taken into consideration, including feature engineering and other computations.

- Computational Efficiency

Entity resolution is computationally demanding, so the algorithms rely on parallel processing techniques. Distributed computational frameworks are necessary to achieve scalability of entity resolution over big data.

Similarities of research with the Study of Semester

- The research develops knowledge of synthesising user requirements and analysing the matching data processing needs, demonstrating adaptability to changing circumstances.

- With the help of this research, an enterprise data model reflecting fundamental business rules is conceptually defined through data modelling, including the attributes and the business rules.

- Physical and logical design are implemented taking into account application, hardware and network communication requirements, and a data manipulation language is used to query, update and manage the database [3].

- Functional dependencies, referential integrity, data integrity and security requirements are identified in order to merge the physical design and apply normalisation techniques.

- A database system is built, and knowledge is acquired through further research, to demonstrate competence in the advanced tasks of modelling and designing the implementation of a data warehouse and data management.


The report has explained in depth the theoretical aspects of end-to-end entity resolution for big data, along with the methodology of data modelling and analysis used to manage the data. A specific methodology and case study are considered in the article, with a general presentation of the entities and algorithms applied. The problem has been observed in recent data-intensive years, with descriptions of real-world entities drawn from government or corporate-specific data sources. The engineering view of entity resolution and its tasks have also been treated from a theoretical standpoint, considering certain algorithms. Big data and open-world systems have also allowed different blocking and matching algorithms to integrate easily with third-party tools for data exploration and for sampling in data analytics.



Read More

DBFN212 Database Fundamentals Report 4 Sample


Students are required to analyse the weekly lecture material of weeks 1 to 11 and create concise content analysis summaries of the theoretical concepts contained in the course lecture slides.

Where the lab content or information contained in technical articles from the Internet or books helps to fully describe the lecture slide content, discussion of such theoretical articles or discussion of the lab material should be included in the content analysis.

The document structure is as follows (2500 Words):

1. Title Page

2. Introduction and Background (85 words)

3. Content analysis (reflective journals) for each week from 1 to 11 (2365 words; 215 words per week):

a. Theoretical Discussion

i. Important topics covered

ii. Definitions

b. Reflection on the Contents

i. Create a brief scenario and apply the database concept learnt on it. You can use the same scenario for every lecture or modify it if needed. (Note: For week 1, you can omit providing a scenario, instead give your interpretation of the concepts.)

c. Outcome

i. What was the objective of the database feature/concept learnt?
ii. How the learnt concept/feature improves your understanding of the database systems.

4. Conclusion (50 words)

Your report must include:

• At least five references, out of which, three references must be from academic resources.
• Harvard Australian referencing for any sources you use.
• Refer to the Academic Learning Skills student guide on Referencing



It is important to reflect on one's overall learning, as it assists in gaining better insight into what has been learned during the course. The present report aims to describe the primary aspects of database technology and database management and to evaluate them critically. It also aims to apply concepts of transaction processing and concurrency in multi-user database systems, and to analyse primary issues related to data retrieval, access, storage, privacy and ethics.

Content Analysis

Week 1

A. Theoretical Discussion

The unit assisted in developing better insight into how a professional career can be developed in the field of database management. The insight about various disadvantages of database systems was gained during the unit, and some of the disadvantages involve complexity in management, increased costs, dependence on the vendor, maintaining currency and frequent replacement and upgrade cycles (Naseri and Ludwig, 2010).

B. Reflection on the Contents

I learned that raw facts make up data, which is typically recorded in a database. The database structure is defined by database design, and databases can be categorised by the number of users, location, and data usage and structure. DBMSs were created to overcome the inherent flaws of the file system, as manual and electronic file systems gave way to databases. The administration of data in a file system has several constraints (Tan et al., 2019).

C. Outcome

The distinction between data and information was defined. There was a discussion about what a database is, its different varieties, and why databases are important decision-making tools. We also saw how file systems evolved into modern databases, and complete learning about the significance of database design was gathered. The major topic was the database system's main components, and during the session we learned the functions of a database management system (DBMS) in detail.

Week 2

A. Theoretical Discussion

Different data views (local and global) and the level of data abstraction influence the modelling of data requirements. A data model is a representation of a complicated real-world data environment. The learning during the unit enhances knowledge of the different database systems and models that are used in practice by organisations in business (Codd, 2015).

B. Reflection on the Contents

I learned that a data model represents a complicated real-world data environment. Relational, network, hierarchical, extended relational, and object-oriented data models are a few examples. Multiple database products are available, such as Cassandra, Couchbase, CouchDB, HBase, Riak, Redis and MongoDB. The MongoDB database is used by e-commerce organisations like eBay and Tesco, whereas Amazon uses its own Amazon SimpleDB, which is a document-oriented database (Dong and Qin, 2018).

C. Outcome

Basic data-modelling building blocks were taught to the students. Data modelling and why data models are important were discussed during the unit, and the participants gained a deeper knowledge of how the key data models evolved. Business rules were introduced by the tutor, along with how they affect database design. The teacher demonstrated how data models are classed according to their level of abstraction, and the unit also showcased emerging alternative data models and the demands they address.

Week 3

A. Theoretical Discussion

Tables are a relational database's basic building blocks. A relational database does a large amount of the data manipulation work behind the scenes.

B. Reflection on the Contents

I learned that a relational database organises data that can be related or linked on the basis of data common to each unit. This ability lets me retrieve a completely new table from the information in one or more tables with the help of a single query. Popular examples of standard relational databases include Oracle Database, Microsoft SQL Server, IBM Db2 and MySQL. Cloud relational database systems include Google Cloud SQL and Amazon RDS (Relational Database Service) (Song et al., 2018).
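The idea of deriving a new table from related tables with a single query can be sketched with Python's standard `sqlite3` module; the table and column names here are invented for illustration.

```python
import sqlite3

# Minimal sketch: two tables linked by a common column (cust_id), and a
# single query that derives a brand-new result table from both.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE customer (cust_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders   (order_id INTEGER PRIMARY KEY,
                           cust_id  INTEGER REFERENCES customer(cust_id),
                           total    REAL);
    INSERT INTO customer VALUES (1, 'Ada'), (2, 'Grace');
    INSERT INTO orders   VALUES (10, 1, 25.0), (11, 1, 40.0), (12, 2, 15.0);
""")
# One query links the tables on the common column and produces a new table.
rows = con.execute("""
    SELECT c.name, SUM(o.total)
    FROM customer c JOIN orders o ON c.cust_id = o.cust_id
    GROUP BY c.name ORDER BY c.name
""").fetchall()
print(rows)  # [('Ada', 65.0), ('Grace', 15.0)]
```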

C. Outcome

The tutor went through the core components of the relational model as well as the content, structures, and properties of a relational table. The teacher also went through how to manipulate relational table contents using relational database operators. The logical structure of the relational database model was discussed in class.

In a relational database, the function of indexing was explained. The teacher also showed how the relational database model handles data redundancy. The session began with the identification of acceptable entities, followed by a discussion of the relationships between the entities in the relational database model. The components and function of the data dictionary and system catalogue were covered in class.

Week 4

A. Theoretical Discussion

The ERM represents the conceptual database as seen by the end-user with ERDs. Database designers are frequently obliged to make design compromises, regardless of how effectively they can generate designs that adhere to all applicable modelling norms (Pokorny, 2016).

B. Reflection on the Contents

I learned that high-level conceptual data models provide ideas for presenting data in ways that are similar to how people perceive data. The entity-relationship model, which employs key concepts such as entities, attributes, and relationships, is a good example. Such models are used heavily in the sales department of a business, as they allow business users to view expense data and sales data and to analyse total demand. They are also used in libraries, where a system holds details about the books, the borrower entities and the library (Das et al., 2019).

C. Outcome

The instructor explained how the database design process refines, defines, and incorporates relationships between entities. The teacher talked about the basic characteristics of entity-relationship components and how to investigate them. The impact of ERD components on database design and implementation was also examined. We learned about relationship components after finishing this chapter. There was some discussion about how real-world database design frequently necessitates the balancing of competing aims.

Week 5

A. Theoretical Discussion

Surrogate primary keys are beneficial when there is no natural key that can be used as a primary key, when the only candidate is a composite primary key spanning attributes of various data kinds, or when the primary key is too long to be used. Entity supertypes, subtypes, and clusters are used in the extended entity-relationship (EER) model to add semantics to the ER model (Lang et al., 2019).

B. Reflection on the Contents

This is an example of a "sub-class" relationship which I developed after learning. We have three kinds of staff here: an engineer, a technician, and a secretary. The employee is the super-class of the three individual sub-classes, which are all subsets of the Employee set.

Employee 1001 will have the attributes eno, salary, name and typing speed because it is a sub-class entity that inherits all of the attributes of the super-class. A sub-class entity has a relationship with a super-class entity. For example, emp 1001 is a secretary with a typing speed of 68, while emp 1009 is a sub-class engineer whose trade is "Electrical", and so on.
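The scenario above can be sketched in relational terms with `sqlite3`: a common implementation of supertype/subtype is one table per sub-class sharing the super-class's primary key, with "inheritance" realised by a join. The schema below is a hypothetical rendering of the scenario, not taken from the lecture material.

```python
import sqlite3

# Sketch of the super-class / sub-class scenario: EMPLOYEE holds the common
# attributes; each sub-class table shares EMPLOYEE's primary key and adds
# its own attributes.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE employee  (eno INTEGER PRIMARY KEY, name TEXT, salary REAL);
    CREATE TABLE secretary (eno INTEGER PRIMARY KEY REFERENCES employee(eno),
                            typing_speed INTEGER);
    CREATE TABLE engineer  (eno INTEGER PRIMARY KEY REFERENCES employee(eno),
                            trade TEXT);
    INSERT INTO employee  VALUES (1001, 'Kim', 52000), (1009, 'Lee', 67000);
    INSERT INTO secretary VALUES (1001, 68);
    INSERT INTO engineer  VALUES (1009, 'Electrical');
""")
# A sub-class row "inherits" the super-class attributes via a join.
row = con.execute("""
    SELECT e.eno, e.name, e.salary, s.typing_speed
    FROM employee e JOIN secretary s ON e.eno = s.eno
""").fetchone()
print(row)  # (1001, 'Kim', 52000.0, 68)
```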

C. Outcome

The properties of good primary keys were discussed, as well as how to choose them. The unit aided in the comprehension of flexible solutions for unique data-modelling scenarios. In an entity-relationship diagram, the class learned about entity clusters, which are used to depict several entities and relationships (ERD). The instructor went over the key extended entity-relationship (EER) model constructs and how EERDs and ERDs represent them.

Week 6

A. Theoretical Discussion

The designer can use the data-modelling checklist to ensure that the ERD meets a set of basic standards. The more tables you have, the more I/O operations and processing logic you need to connect them.

B. Reflection on the Contents

I learned during the module that normalisation is a technique for creating tables with as few data redundancies as possible. A table is in 1NF when a primary key is defined and all attributes depend on that key; it is in 2NF when it is in 1NF and has no partial dependencies; and it is in 3NF when it is in 2NF and has no transitive dependencies.
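A 3NF decomposition can be sketched concretely with `sqlite3`. The tables below are invented for illustration: in an unnormalised design, `dept_name` would depend on `dept_id` rather than on the key `emp_id` (a transitive dependency), so it is moved into its own table.

```python
import sqlite3

# Sketch: removing a transitive dependency (dept_name -> dept_id -> emp_id)
# by decomposing into two 3NF tables.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dept     (dept_id INTEGER PRIMARY KEY, dept_name TEXT);
    CREATE TABLE employee (emp_id  INTEGER PRIMARY KEY,
                           emp_name TEXT,
                           dept_id INTEGER REFERENCES dept(dept_id));
    INSERT INTO dept     VALUES (10, 'Sales'), (20, 'IT');
    INSERT INTO employee VALUES (1, 'Ada', 10), (2, 'Grace', 10), (3, 'Alan', 20);
""")
# Renaming a department now touches one row instead of every employee row,
# which is exactly the redundancy that normalisation removes.
con.execute("UPDATE dept SET dept_name = 'Marketing' WHERE dept_id = 10")
rows = con.execute("""
    SELECT e.emp_name, d.dept_name
    FROM employee e JOIN dept d ON e.dept_id = d.dept_id
    ORDER BY e.emp_id
""").fetchall()
print(rows)  # [('Ada', 'Marketing'), ('Grace', 'Marketing'), ('Alan', 'IT')]
```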

C. Outcome

We understood the use of a data-modelling checklist to verify that an ERD meets a set of minimum requirements. The teacher also helped us investigate situations that demand denormalisation to generate information efficiently. The class learned about applying normalisation rules to evaluate and correct table structures, and discussed the role of normalisation in the data design process. The teacher also covered the normal forms known as 1NF, 2NF, 3NF, BCNF and 4NF, and the class discussed how tables can be transformed from lower normal forms into higher normal forms.

Week 7

A. Theoretical Discussion

To limit the rows affected by a DML command, use the WHERE clause with the UPDATE, SELECT, and DELETE commands. When it is necessary to process data depending on previously processed data, sub-queries and correlated queries are employed. Relational set operators in SQL allow you to combine the results of two queries to create a new relation (Alvanos Michalis, 2019).

B. Reflection on the Contents

I learned that all RDBMS vendors support the ANSI standard data types, though in different ways. In SQL, the SELECT statement is the most used data retrieval instruction. Inner joins and outer joins are two types of table-joining operations. A natural join avoids duplicate columns by returning all rows with matching values in the matching columns (Koch & König, 2018).
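The difference between the inner and outer joins mentioned above can be sketched with `sqlite3` on two hypothetical tables: the inner join drops customers with no orders, while the LEFT OUTER JOIN keeps them with NULLs.

```python
import sqlite3

# Sketch: inner join vs. LEFT OUTER JOIN on invented example tables.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE customer (cust_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders   (order_id INTEGER PRIMARY KEY, cust_id INTEGER);
    INSERT INTO customer VALUES (1, 'Ada'), (2, 'Grace');
    INSERT INTO orders   VALUES (10, 1);       -- Grace has no orders
""")
inner = con.execute("""
    SELECT c.name, o.order_id
    FROM customer c JOIN orders o ON c.cust_id = o.cust_id
""").fetchall()
outer = con.execute("""
    SELECT c.name, o.order_id
    FROM customer c LEFT OUTER JOIN orders o ON c.cust_id = o.cust_id
    ORDER BY c.cust_id
""").fetchall()
print(inner)  # [('Ada', 10)]                  -- unmatched row dropped
print(outer)  # [('Ada', 10), ('Grace', None)] -- unmatched row kept
```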

C. Outcome

In this week, we learned to retrieve specified columns from the data of a large database. The class learned how to join different tables in a single SQL query. There was an in-depth discussion about restricting data retrieval to rows that match complex sets of criteria. The class also learned about aggregating data across rows and groups of rows. The teacher helped us create subqueries that preprocess data for inclusion in other queries. In the class, we learned to identify and use different SQL functions for numeric, string and data manipulation. There was also an in-depth discussion about crafting SELECT queries.

Week 8

A. Theoretical Discussion

A cursor is required when SQL statements in PL/SQL code are meant to return several values. A stored procedure is a set of SQL statements with a unique name. Embedded SQL refers to SQL statements used within an application written in a programming language such as Visual Basic .NET, Java, COBOL or C# (Lindsay, 2019).

B. Reflection on the Contents

All RDBMS vendors support the ANSI standard data types in different ways. I learned that tables and indexes can be created using the basic data-definition commands. You can use data-manipulation commands to add, change, and delete rows in tables. Sequences can be used in Oracle and SQL Server to generate values to be allocated to records. Views can be used to show end users subsets of data, typically for security and privacy reasons (Raut, 2017).
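The DDL, DML and view concepts above can be sketched in one short `sqlite3` script; the schema is hypothetical, and the view hides the salary column as an example of restricting what end users see.

```python
import sqlite3

# Sketch: CREATE TABLE / CREATE INDEX (DDL), INSERT / UPDATE (DML), and a
# view exposing a subset of columns for privacy.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE employee (eno INTEGER PRIMARY KEY, name TEXT, salary REAL);
    CREATE INDEX idx_emp_name ON employee(name);
    INSERT INTO employee VALUES (1, 'Ada', 90000), (2, 'Grace', 95000);
    UPDATE employee SET salary = salary * 1.1 WHERE eno = 1;
    -- The view shows names only; salaries stay hidden from end users.
    CREATE VIEW emp_public AS SELECT eno, name FROM employee;
""")
rows = con.execute("SELECT * FROM emp_public ORDER BY eno").fetchall()
print(rows)  # [(1, 'Ada'), (2, 'Grace')]
```

(SQLite has no sequence objects or stored procedures; in Oracle or SQL Server those would be created with `CREATE SEQUENCE` and `CREATE PROCEDURE`.)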

C. Outcome

We learned to manipulate data using SQL, including how to insert, update and delete rows of data. In the module, we also gained the knowledge to create databases and updatable views. SQL also helped me create tables through the use of subqueries. Throughout the module, we learned to add, modify and remove tables, constraints and columns, and to create database views, including updatable views. By studying the whole module, we also learned to use the procedural language PL/SQL to create stored procedures, triggers and PL/SQL functions, and the module taught me to create embedded SQL.

Week 9

A. Theoretical Discussion

An information system is intended to assist in transforming data into information as well as in managing both data and information. The SDLC (Systems Development Life Cycle) traces an application's journey through the information system (Mykletun and Tusdik, 2019).

B. Reflection on the Contents

I learned that the SDLC (systems development life cycle) is a conceptual model used in project management that describes the stages involved in an information system development project, from an initial feasibility study through to the maintenance of the completed application. The SDLC can be applied to both technical and non-technical systems (Omollo and Alago, 2020).

C. Outcome

The module enhanced my knowledge of database design for building information systems. It explained the five phases of the System Development Life Cycle and the six phases of the Database Life Cycle (DBLC) framework, and covered revision and evaluation within both the DBLC and SDLC frameworks. We learned the bottom-up and top-down approaches to database design, and the module also helped me distinguish between centralised and decentralised approaches to conceptual database design.

Week 10

A. Theoretical Discussion

Two SQL statements support transactions: COMMIT, which saves changes to disk, and ROLLBACK, which restores the database to its previous state. Concurrency control coordinates the simultaneous execution of many transactions. Transactions have four major properties: atomicity, consistency, isolation, and durability (ACID). Database recovery returns a database from a given state to a previously consistent state.
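The COMMIT/ROLLBACK behaviour can be sketched with SQLite; the account table and amounts below are invented. The committed change survives, while the rolled-back change is undone, returning the database to its last consistent state.

```python
import sqlite3

# isolation_level=None puts sqlite3 in autocommit mode so that
# BEGIN/COMMIT/ROLLBACK can be issued explicitly.
conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute("CREATE TABLE account (id INTEGER PRIMARY KEY, balance REAL)")
conn.execute("INSERT INTO account VALUES (1, 100.0)")

conn.execute("BEGIN")
conn.execute("UPDATE account SET balance = balance - 30 WHERE id = 1")
conn.execute("COMMIT")            # change saved: balance is now 70

conn.execute("BEGIN")
conn.execute("UPDATE account SET balance = balance - 999 WHERE id = 1")
conn.execute("ROLLBACK")          # change undone: back to the committed 70

balance = conn.execute(
    "SELECT balance FROM account WHERE id = 1").fetchone()[0]
```

This is the atomicity property in miniature: the second update either happens entirely or, after ROLLBACK, not at all.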

B. Reflection on the Contents

I learned that recovery management in a DBMS allows the database to be restored to a correct operating condition and interrupted transaction processing to be restarted. The aim of maintaining database transaction integrity is to ensure that no unauthorised changes occur, whether through system error or user interaction (Semancik, 2019).

C. Outcome

The module helped me gain an understanding of database transaction management and described the various properties of transactions. During the unit, we studied the role of concurrency control in maintaining database integrity, and the lectures covered the locking, time-stamping, and optimistic methods that can be used for concurrency control. The module also explained the ANSI transaction isolation levels and discussed database recovery as a means of managing database integrity.

Week 11

A. Theoretical Discussion

SQL data services (SDS) are a cloud-based data management service that offers enterprises of all sizes relational data storage, local management, and ubiquitous access. The Extensible Markup Language (XML) supports B2B and other data exchanges over the Internet (Jones, 2019).
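The kind of B2B exchange XML enables can be illustrated with a toy purchase-order document parsed using Python's standard library; the element and attribute names here are invented for the example.

```python
import xml.etree.ElementTree as ET

# A hypothetical order document as one partner might send it to another.
order_xml = """
<order id="A-1001">
  <buyer>Acme Pty Ltd</buyer>
  <line sku="WIDGET-9" qty="4" unit_price="2.50"/>
  <line sku="BOLT-3" qty="10" unit_price="0.40"/>
</order>
"""

root = ET.fromstring(order_xml)
# The receiving system can total the order without knowing anything
# about the sender beyond the agreed document structure.
total = sum(int(line.get("qty")) * float(line.get("unit_price"))
            for line in root.findall("line"))
```

Because both sides agree only on the document structure, neither needs access to the other's internal database, which is precisely what makes XML useful for inter-business exchange.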

B. Reflection on the Contents

Microsoft's database connectivity interfaces are market leaders, with support from the majority of database vendors. I learned that a connection interface provided by a database vendor exclusively for its own products is referred to as native database connectivity. Database connectivity refers to the means by which application programs connect to and communicate with data repositories.

C. Outcome

The class explained the standard database connectivity interfaces. The teacher described the features and functionality of various database connectivity technologies, including OLE DB, ODBC, ADO.NET, and JDBC. There was an in-depth discussion of how database middleware is used to integrate databases over the Internet, and the teacher helped us identify the services provided by web application servers and discussed how XML (Extensible Markup Language) is used in developing web databases (Sharma et al., 2018).


The report described the primary aspects of database management and technology, critically evaluated database technology and data management, and applied concepts related to transaction processing and concurrency in multi-user database systems. It also evaluated major challenges related to data access, retrieval, storage, privacy, and ethics.



COIT20253 Business Intelligence Using Big Data Report Sample

Assessment Task:

Assignment 1 is an individual assessment. In this assessment, you are assigned tasks which assess your unit knowledge gained between weeks 1 and 5 about big data and how it can be used for decision making in any industry. All students will have to write a “professional” business report with Executive summary, Table of Content (MS generated); Introduction; Discussion; Conclusion; Recommendations and References.

Please note that ALL submissions will be checked by a computerised copy detection system and it is extremely easy for teaching staff to identify copied or otherwise plagiarised work.

• Copying (plagiarism) can incur penalties, ranging from deduction of marks to failing the unit or even exclusion from the University.

• Please ensure you are familiar with the Academic Misconduct Procedures. As a student, you are responsible for reading and following CQUniversity’s policies, including the Student Academic Integrity Policy and Procedure.

In this assessment, you are required to choose one of the following industries: Healthcare, Insurance, Retailing, Marketing, Finance, Human resources, Manufacturing, Telecommunications, or Travel.

This assessment consists of two parts as follows:

Part A - You are required to prepare a professional report on WHY Big Data should be integrated into a business to create opportunities and help the value creation process for your chosen industry.

Part B - You need to identify at least one open dataset relevant to the industry and describe what opportunities it could create by using this dataset. You can access open data source from different websites. Please try finding it using Google.

In Part A, you will describe what new business insights you could gain from Big Data, how Big Data could help you to optimise your business, how you could leverage Big Data to create new revenue opportunities for your industry, and how you could use Big Data to transform your industry to introduce new services into new markets. Moreover, you will need to elaborate on how you can leverage the four big data business drivers - structured data, unstructured data, low latency data and predictive analytics - to create value for your industry. You are also required to use Porter’s Value Chain Analysis model and Porter’s Five Forces Analysis model to identify how the four big data business drivers could impact your business initiatives.


Part A


The integration of big data has emerged as a transformative force in today's rapidly evolving business landscape, reshaping industries and redefining organizational paradigms. The sheer volume and variety of data available have paved the way for unprecedented insights and opportunities. This report explores the multifaceted impact of big data on business initiatives, elucidating how four key drivers, i.e., structured data, unstructured data, low-latency data and predictive analytics, intersect with Porter's Value Chain Analysis and Five Forces Analysis. By delving into these interactions, the report aims to provide a comprehensive understanding of how big data drivers foster value creation, enhance operational efficiency and steer strategic decision-making across industries.

Big Data Opportunities

Enhanced Customer Insights and Personalization:

Big data analytics offers the power to delve into expansive customer datasets which can help to unveil new insights into preferences, behaviors, and trends (Himeur et al. 2021). Businesses can create personalized experiences that resonate deeply with their customers by harnessing this data. Personalization has cultivated a strong bond between the business and its customers from tailored product recommendations based on browsing history to precisely targeted marketing campaigns. This not only amplifies customer satisfaction but also fosters loyalty and advocacy which can be considered as a major parameter to drive sustained revenue growth. Personalized experiences have become a defining factor in competitive differentiation in industries such as e-commerce, retail, and hospitality.

Operational Efficiency and Process Optimization:

Big data's analytical prowess extends to scrutinizing intricate operational processes. Organizations can leverage this capability to identify inefficiencies, bottlenecks, and areas for improvement. Companies gain a holistic view of their workflows by analyzing operational data that can help to enable them to streamline operations along with reducing resource wastage and enhancing overall productivity. Integrating real-time and low-latency data empowers businesses to make agile decisions, ensuring prompt adaptation to dynamic market shifts. Industries spanning manufacturing, logistics, and healthcare can reap significant benefits from this opportunity, resulting in cost savings and improved service delivery.

Predictive Analytics for Proactive Decision-making:

The integration of predictive analytics into big data strategies empowers industries to foresee future trends and outcomes (Stylos, Zwiegelaar & Buhalis, 2021). This predictive prowess holds applications across various sectors, from retail to finance. By analyzing historical data and identifying patterns, businesses can forecast demand, anticipate market shifts, and assess potential risks. Armed with these insights, organizations can make proactive decisions that minimize risks and capitalize on emerging opportunities. In sectors where timeliness is paramount, such as finance and supply chain management, predictive analytics offers a competitive edge.

Innovation and New Revenue Streams:

Big data serves as a wellspring of inspiration for innovation. Industries can leverage data-driven insights from customer feedback, market trends, and emerging technologies to create novel products and services. By identifying gaps in the market and understanding unmet needs, businesses can design solutions that resonate with consumers. These innovations not only open new revenue streams but also position organizations as market leaders. Industries as diverse as technology, healthcare, and agriculture can leverage this opportunity to foster disruptive ideas that cater to evolving demands.

Value Creation Using Big Data

Enhanced Decision-making and Insights:

Big data equips industries with a wealth of information that transcends traditional data sources. By amassing vast volumes of structured and unstructured data, businesses can extract actionable insights that drive informed decision-making (Ajah & Nweke, 2019). From consumer behavior patterns to market trends, big data analysis unveils previously hidden correlations and emerging opportunities. This heightened awareness empowers industries to make strategic choices grounded in empirical evidence, mitigating risks and optimizing outcomes. In sectors such as retail and finance, data-driven insights enable precision in understanding customer preferences and forecasting market shifts, ultimately shaping successful strategies.

Operational Efficiency and Process Optimization:

The integration of big data analytics facilitates the optimization of operational processes, delivering heightened efficiency and resource allocation. Through data-driven analysis, industries identify inefficiencies and bottlenecks that hinder productivity. This leads to targeted process improvements and streamlined workflows, translating into resource and cost savings. Moreover, real-time data feeds enable agile adjustments, enabling swift responses to market fluctuations. Industries such as manufacturing and logistics reap substantial benefits, achieving seamless coordination and reduced wastage through data-informed process enhancement.

Personalized Customer Experiences:

Big data revolutionizes customer engagement by enabling hyper-personalization. By analyzing vast datasets comprising customer behavior, preferences, and transaction history, businesses can tailor offerings to individual needs (Shahzad et al. 2023). This personalization extends to tailored marketing campaigns, product recommendations, and service interactions, enhancing customer satisfaction and loyalty. In industries like e-commerce and telecommunications, personalized experiences not only foster customer retention but also amplify cross-selling and upselling opportunities, consequently elevating revenue streams.

Innovation and New Revenue Streams:

Big data serves as a catalyst for innovation, propelling industries to develop groundbreaking products and services. By decoding customer feedback, market trends, and emerging technologies, businesses gain insights that steer novel offerings. This innovation not only fosters market differentiation but also creates new revenue streams. Industries ranging from healthcare to entertainment tap into big data to identify gaps in the market and devise disruptive solutions. This adaptability to evolving consumer demands positions businesses as pioneers in their sectors.

Porter’s Value Chain Analysis

Porter's Value Chain Analysis is a strategic framework that helps organizations dissect their operations into distinct activities and examine how each activity contributes to the creation of value for customers and, consequently, the organization as a whole (Ngunjiri & Ragui, 2020).

Porter's Value Chain Components:

Now, applying this analysis to the impact of four big data business drivers - structured data, unstructured data, low latency data, and predictive analytics - can offer valuable insights into how these drivers influence various stages of the value chain.

Support Activities:

1. Firm Infrastructure: Big data impacts strategic decision-making. Structured data provides historical performance insights, guiding long-term planning. Unstructured data can uncover emerging market trends and competitive intelligence, influencing strategic initiatives.

2. Human Resources: Big data assists in talent management. Structured data aids in identifying skill gaps and training needs. Unstructured data, such as employee feedback and sentiment analysis, offers insights into employee satisfaction and engagement.

3. Technology: Technology plays a pivotal role in handling big data. The integration of structured and unstructured data requires robust IT infrastructure. Low latency data ensures real-time data processing and analysis capabilities, enhancing decision-making speed.

4. Procurement: Big data enhances procurement processes (Bag et al. 2020). Structured data supports supplier performance evaluation, aiding in supplier selection. Unstructured data assists in supplier risk assessment by analyzing external factors that may impact the supply chain.

Applying the Value Chain Analysis: To illustrate, let's consider a retail business. The impact of big data drivers can be observed across the value chain. Structured data aids in optimizing inventory management and supplier relationships in inbound logistics. Low latency data ensures real-time monitoring of stock levels and customer preferences in operations. Predictive analytics forecasts demand patterns in marketing and sales which can create tailored promotions and inventory adjustments. Post-sale service benefits from unstructured data insights into customer feedback which aids in improving customer satisfaction.

Porter’s Five Forces Analysis

1. Competitive Rivalry:

Big data drivers have a profound impact on competitive rivalry within an industry. Structured data enables companies to analyze market trends along with customer preferences and competitive benchmarks which fosters strategic differentiation (Suoniemi et al. 2020). Unstructured data can provide insights into brand perception and competitive positioning such as social media sentiment. Businesses can anticipate shifts in customer demands by leveraging predictive analytics which can enhance their ability to innovate and stay ahead of competitors. Low latency data ensures real-time decision-making that allows businesses to respond promptly to competitive moves.

2. Supplier Power:

The utilization of big data drivers can reshape the dynamics of supplier power. Structured data aids in supplier evaluation, facilitating data-driven negotiations and contract terms. Unstructured data provides insights into supplier reputations, helping businesses make informed decisions. Low latency data enhances supply chain visibility, which can reduce dependency on single suppliers (Singagerda, Fauzan & Desfiandi, 2022). Predictive analytics anticipates supplier performance and potential disruptions, allowing proactive risk mitigation strategies.

3. Buyer Power:

Big data drivers impact buyer power by enabling businesses to tailor offerings to customer preferences. Structured data allows for customer segmentation and customized pricing strategies. Unstructured data offers insights into buyer sentiments that can influence marketing and product strategies. Predictive analytics helps forecast consumer demand which can allow businesses to adjust pricing and supply accordingly (Bharadiya, 2023). Low latency data ensures quick responses to changing buyer behaviors and preferences.

4. Threat of Substitution:

Big data drivers can influence the threat of substitution by enhancing customer loyalty. Structured data-driven insights enable businesses to create personalized experiences that are difficult for substitutes to replicate (Sjödin et al. 2021). Unstructured data offers insights into customer feedback and preferences which can provide support for continuous improvement and product differentiation. Predictive analytics anticipates customer needs in order to reduce the likelihood of customers seeking alternatives. Low latency data ensures quick adaptation to market shifts that can reduce the window of opportunity for substitutes.

5. Threat of New Entrants:

The incorporation of big data drivers can impact the threat of new entrants by raising barriers to entry. Structured data enables established businesses to capitalize on economies of scale and create efficient operations which makes it challenging for newcomers to compete. Unstructured data provides insights into customer preferences to support brand loyalty. Predictive analytics helps incumbents anticipate market trends which enable preemptive strategies against new entrants. Low latency data facilitates real-time responses to emerging threats which can reduce the vulnerability of established players.


The integration of big data drivers into business strategies represents a pivotal juncture in the ongoing digital transformation. The confluence of structured and unstructured data, along with the power of low-latency data and predictive analytics, alters the fundamental fabric of industries. From optimizing processes to driving innovation, big data's imprint is visible across the value chain and competitive dynamics. As organizations harness this potential, they position themselves to thrive in an era where data-driven insights are the cornerstone of informed decision-making and sustainable growth. By embracing big data's capabilities, businesses are poised to navigate challenges, seize opportunities, and unlock the full spectrum of possibilities presented by the data-driven future.

Part B

Dataset identification

The dataset includes several parameters related to the retail industry, focusing on date-wise CPI and unemployment rate along with a weekly holiday indicator. It can help identify the consumer price index and the unemployment rate in the retail industry and the impact of holidays on them. The dataset is openly available and consists of three data files, of which the one considered here is the ‘Featured data set’ (Kaggle, 2023). It can be identified as a suitable dataset because it provides structured data from which different outcomes can be analysed.

Metadata of The Chosen Dataset

The selected dataset pertains to the retail industry and encompasses parameters such as Store, Date, Temperature, Fuel_Price, and various MarkDown values (MarkDown1 to MarkDown5), along with CPI (Consumer Price Index), Unemployment rate, and IsHoliday indicator. This metadata provides crucial insights into the dataset's composition and relevance within the retail sector.

The "Store" parameter likely represents unique store identifiers, facilitating the segregation of data based on store locations. "Date" captures chronological information, potentially enabling the analysis of temporal trends and seasonality. "Temperature" and "Fuel_Price" suggest that weather conditions and fuel costs might influence retail performance, as these factors impact consumer behavior and purchasing patterns.

The "MarkDown" values could denote promotional discounts applied to products, aiding in assessing the impact of markdown strategies on sales. Parameters like CPI and Unemployment offer a macroeconomic context, possibly influencing consumer spending habits. The "IsHoliday" parameter indicates whether a given date corresponds to a holiday, offering insights into potential fluctuations in sales during holiday periods.

Business Opportunities Through The Chosen Dataset

The analytical findings indicating a lower average unemployment rate on holidays and a higher average Consumer Price Index (CPI) during holiday periods hold significant implications for the chosen industry. These insights unveil a range of strategic opportunities that the industry can capitalize on to drive growth, enhance customer experiences, and optimize its operations.

Figure 1: Consumer price index comparison
(Source: Author)
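The holiday vs non-holiday comparison behind these findings can be sketched using the dataset's column names (CPI, Unemployment, IsHoliday); the sample rows below are invented stand-ins for the real data.

```python
from statistics import mean

# Hypothetical weekly records shaped like the Kaggle 'Featured data set'.
rows = [
    {"CPI": 211.0, "Unemployment": 8.1, "IsHoliday": False},
    {"CPI": 212.5, "Unemployment": 7.9, "IsHoliday": True},
    {"CPI": 210.8, "Unemployment": 8.2, "IsHoliday": False},
    {"CPI": 213.1, "Unemployment": 7.8, "IsHoliday": True},
]

def avg_by_holiday(rows, field):
    """Average `field` separately for holiday and non-holiday weeks."""
    return {flag: mean(r[field] for r in rows if r["IsHoliday"] is flag)
            for flag in (True, False)}

cpi = avg_by_holiday(rows, "CPI")
unemp = avg_by_holiday(rows, "Unemployment")
```

With these invented rows, the averages reproduce the pattern described above: CPI is higher and unemployment lower in holiday weeks.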

Increased Consumer Spending: The lower average unemployment rate on holidays suggests a potential uptick in consumer spending power during these periods. This provides a prime opportunity for the industry to design targeted marketing campaigns, exclusive offers, and attractive promotions. By aligning their product offerings and marketing strategies with consumers' improved financial situations, businesses can drive higher sales volumes and revenue.

Customized Product Assortments: The availability of higher disposable income on holidays opens the door to curating specialized product assortments. Retailers can introduce premium and luxury items, cater to aspirational purchases, and offer exclusive collections that cater to elevated consumer spending capacity. This approach enhances the perceived value of products and creates a unique shopping experience.

Figure 2: Unemployment rate comparison
(Source: Author)

Strategic Inventory Management: Capitalizing on the lower unemployment rate on holidays can drive retailers to anticipate increased foot traffic and online orders. This presents an opportunity for strategic inventory management. Businesses can optimize stock levels, ensure the availability of popular products, and align staffing resources to accommodate higher consumer demand, ultimately enhancing customer satisfaction.

Enhanced Customer Engagement: With a heightened CPI during holidays, businesses can strategically invest in enhancing customer experiences to match the anticipated premium pricing. This could involve personalized shopping assistance, concierge services, or engaging in-store events. Elevated customer engagement fosters brand loyalty and differentiates the business in a competitive market.

Dynamic Pricing Strategies: The observed correlation between higher CPI and holidays enables the adoption of dynamic pricing strategies. By leveraging these insights, the industry can implement flexible pricing models that respond to demand fluctuations. This approach optimizes revenue generation while maintaining alignment with consumer expectations and market trends.




DATA4100 Data Visualisation Software Report 4 Sample

Your Task

This written report with a dashboard is to be created individually.

• Given a business problem and data, finalise visualisations and prepare a report for the Australian Department of Foreign Affairs and Trade.

• On Tuesday of week 13 at or before 23:55 AEST submit your written report as a Microsoft Word file with a snapshot of your dashboard via Turnitin. This assessment covers Learning outcomes: LO2, LO3

Assessment Description

Should Australia enter a free trade agreement with Germany?

Business Background:

Germany, Japan, South Korea, United States, France and China are amongst the main exporters of cars. Suppose that the Australian government is particularly interested in the products exported from Germany, as well as Australian products exported to Germany, in considering the possibility of a free trade agreement.

Suppose that you have been asked, as an analyst for the Australian Department of Foreign Affairs and Trade, to report on exports from Germany, and in particular, the types of products Germany exports to Australia. Likewise, analyse the products that Australia exports to Germany currently, based on your own research into available data sets.

Your written report (to be prepared in this assessment - in Assessment 4) will ultimately end up in the hands of the minister for trade and investment, so any final decisions made should be supported by data. In Assessment 4, you are to finish designing your visualisations, then prepare a report by interpreting the visualisations and integrating them with theory from this subject.

Data set

- Use the data given to you in week 11

Assessment Instructions

- As an individual, finish the visualisations for your report.

- Write a structured report with appropriate sections, as follows:

- Introduce the business problem and content of your report. (150 words)

- Interpret your charts, summaries, clustering and any other analyses you have done in the process of creating your visualisations, and link them back to the business problem of whether Australia should enter a free trade agreement with Germany. (800 words)

- Justify the design of your visualisations in terms of what you have learnt about cognitive load and pre-attentive attributes and Tufte’s principles. (250 words)

- On Tuesday of week 13 at or before 23:55 AEST submit your report as a Microsoft Word file, containing your visualisations, via Turnitin.



This report is based on the analysis and visualisation of business exports between two countries, Australia and Germany. The business problem concerns exports from Australia to Germany and from Germany to Australia, which include animal-based products such as live animals and meat. The problem is to analyse and visualise the data provided for exports in both directions. The purpose of this report is to provide an understanding of the total trade value and product types traded between the two countries, so that the Australian government can make decisions about product exports and imports between them. The report presents visualisations for both Australia-to-Germany and Germany-to-Australia exports, along with a prototype. Power BI, a business intelligence tool, was used for the visualisation and analysis of the provided data. At the end of the report, each visualisation is justified with reference to its attributes, and the important points are concluded. The data was loaded into Power BI for visualisation and cleaned by removing null values. A cluster line chart was created for product exports against trade value, and clustering was performed on product type and trade value by year.

Data Gathering and Analysis

The data was collected on imports and exports between Germany and Australia, including product type, product category and the total trade value generated by each country for each individual product. The data was uploaded into the business intelligence tool to check its validity for further analysis; validating the data is important for obtaining reliable results that support decisions on the business problem. For the analysis and visualisation, two different chart types were used, a cluster line chart and a clustering chart, so that each attribute could be analysed visually.
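The null-removal cleaning step applied in Power BI can be sketched in plain Python; the records and field names below are invented stand-ins for the trade data.

```python
# Hypothetical raw export records; the second has a missing trade value.
records = [
    {"product": "Minerals",  "year": 2018, "trade_value": 120.5},
    {"product": "Chemicals", "year": 2018, "trade_value": None},
    {"product": "Machines",  "year": 2019, "trade_value": 300.2},
]

# Keep only records where every field is present, mirroring the
# remove-null-values step performed in the BI tool.
clean = [r for r in records if all(v is not None for v in r.values())]
```

Dropping incomplete records before charting prevents null trade values from distorting the aggregated totals.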

Australia to Germany

This section discusses the analysis and visualisation of exports from Australia to Germany. The exports involve multiple products belonging to the animal product category, along with the total trade value generated by each product. The analysis was carried out year by year, considering the trade value for each year.

Cluster line and Bar Chart

The above chart shows the cluster line chart created from the trade value and the product category for each year. The visualisation shows the total products exported each year along with the total trade value generated by each product exported from Australia to Germany. As can be seen in the chart, mineral products and chemical products generated the highest trade values in exports from Australia to Germany. Trade value then decreases steadily across the remaining product categories, with the lowest trade value generated by weapons, showing that the smallest export from Australia to Germany is of weapons. This visualisation covers the three consecutive years from 2018 to 2020 only, as that is the period the data represents.
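The aggregation feeding a clustered chart like this can be sketched in a few lines; the categories and figures below are invented, not taken from the actual dataset.

```python
from collections import defaultdict

# Hypothetical (category, year, trade_value) export records.
exports = [
    ("Mineral products",  2018, 950.0),
    ("Mineral products",  2019, 880.0),
    ("Chemical products", 2018, 610.0),
    ("Weapons",           2018,  12.0),
]

# Total trade value per (category, year) - one bar per cluster.
totals = defaultdict(float)
for category, year, value in exports:
    totals[(category, year)] += value

# Rank categories by combined value across years, as the chart narrative does.
by_category = defaultdict(float)
for (category, _year), value in totals.items():
    by_category[category] += value
ranking = sorted(by_category, key=by_category.get, reverse=True)
```

With the invented figures, the ranking mirrors the chart's story: mineral products first, weapons last.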

Cluster Chart

The above visualisation shows the cluster chart for the product types exported from Australia to Germany in each individual year. The coloured dots show each product type along with the trade value it generated from 2018 to 2020. Each product type is highlighted in a different colour for easier identification, together with its trade value. This graph also shows that mineral products achieved the highest trade value in exports from Australia to Germany.

Pie Chart

The above visualisation shows a pie chart of the total trade value generated by each product type exported from Australia to Germany. Each product is represented by a different colour, with the trade value shown at the outer edge of the pie chart. This representation clearly identifies the highest and lowest trade values generated by each product in each individual year from 2018 to 2020. Together, the different visualisations show the product category, product type and total trade value generated by each product exported from Australia to Germany.

Germany to Australia

This section discusses the visualisation and analysis of exports from Germany to Australia. Based on the provided dataset, it can be observed that multiple categories of products were exported from Germany to Australia from 2018 to 2020. The product categories include animal products, vegetable products, foodstuffs and fruit, along with minerals and chemical products. This section also presents three different visualisations: a cluster line chart, a cluster chart and a pie chart.

Cluster line and Bar chart

The above graph shows the cluster line chart for the export data from Germany to Australia, presenting the products exported each year with the trade value generated by each product. As can be seen in the visualisation, transportation generated the highest trade value from Germany to Australia, with machinery exports generating the second highest. The least trade value was generated by animal and vegetable by-products, because they make up very little of the export business between the two countries. This graph also covers the period from 2018 to 2020.

Cluster Chart

The above graph shows the cluster chart visualising product type against trade value for each individual year from 2018 to 2020. Each product type is represented by a different colour, and each dot in the scatter plot represents a product type along with the total trade value it generated. The highest and lowest trade values can also be identified from the dots in the cluster visualisation, and each product type can be identified individually along with its exact value.

Pie Chart

This pie chart shows the exports of each product type together with the trade value each generated. Each product type is highlighted in a different colour, with its trade value labelled at the rim of the pie. It can be observed that transportation generated the highest trade value in exports from Germany to Australia. Here transportation refers to products in the transportation category, such as vehicles, so the largest export from Germany to Australia is vehicles. The second largest export from Germany to Australia is machinery and other mechanical products.


Based on the above visualisations and analysis, Germany is a strong exporter of machine-related products such as vehicles and other machinery, while Australia is strong in mineral products, which is why Australia generated high trade value when exporting minerals to Germany. The two countries have different areas of expertise, and each generates high trade value in its respective exports. The business value between Australia and Germany is high because of the volume of goods exported and imported between the two countries. The analysis and visualisation of the export data between Australia and Germany has therefore been completed successfully.



DATA4000 Introduction to Business Analytics Report 3 Sample

Your Task

Consider below information regarding the National Australia Bank data breach. Read the case study carefully and using the resources listed, together with your own research, complete: Part A (Industry Report).

Assessment Description

Bank of Ireland


Bank of Ireland has been fined €463,000 by the Data Protection Commission for data breaches affecting more than 50,000 customers. The fine follows an inquiry into 22 personal data breach notifications that Bank of Ireland made to the Commission between 9 November 2018 and 27 June 2019. One of the data breach notifications alone affected 47,000 customers.

The breaches related to the corruption of information in the bank's data feed to the Central Credit Register (CCR), a centralised system that collects and securely stores information about loans. The incidents included unauthorised disclosures of customer personal data to the CCR and accidental alterations of customer personal data on the CCR.


As an analyst within Bank of Ireland, you have been tasked with considering ways in which customer data can be used to further assist Bank of Ireland with its marketing campaigns. As a further task, you have been asked to consider how Bank of Ireland could potentially assist other vendors interested in the credit card history of its customers.

Assessment Instructions

Part A: Industry Report (1800 words, 25 marks) - Individual

Based on your own independent research, you are required to evaluate the implications of the European legislation such as GDPR on Bank of Ireland’s proposed analytics project and overall business model. Your report can be structured using the following headings:

Data Usability

- Benefits and costs of the database to its stakeholders.
- Descriptive, predictive and prescriptive applications of the data available and the data analytics software tools this would require.

Data Security and privacy

- Data security, privacy and accuracy issues associated with the use of the database in the way proposed in the brief.

Ethical Considerations

- The ethical considerations behind whether the customer has the option to opt in or opt out of having their data used and stored in the way proposed by the analytics brief

- Other ethical issues of gathering, maintaining and using the data in the way proposed above.

Artificial Intelligence

- How developments in AI intersects with data security, privacy and ethics, especially in light of your proposed analytics project.

It is a requirement to support each of the key points you make with references (both academic and “grey” material). Use the resources provided as well as your own research to assist with data collection and data privacy discussions.


Part A: Industry Report


Bank of Ireland, like other commercial lenders, manages the risk connected with the mortgages it issues through the application of data. It does this by analysing the information it collects about individual clients: the dataset can include, among other things, a client's credit rating, payment card usage, balances owing on various payment cards, and balances owed on various kinds of credit (net loan capacity). Credit security assessment is the study of past data to determine a borrower's creditworthiness or the hazard associated with issuing a loan (Shema 2019, p. 2). The research findings help financial organisations such as Bank of Ireland to assess both their own risks and those of their clients.

Data Usability

A stakeholder in a data management program is a person or group that might influence, or be impacted by, the information administration procedure. The stakeholder database is more than a public-relations device; it also acts as documentation for compliance and verification, serves as a trustworthy source of data for future evaluations or studies, and fosters long-term effectiveness. Stakeholder databases are essential, yet they are frequently underfunded, and numerous businesses continue to keep their data in unprotected spreadsheets (Campello, Gao, Qiu, & Zhang 2018, p. 2). The average cost to design a database management application is $24,000, although the full price ranges from $11,499 to $59,999; an application with fewer capabilities, such as a minimum viable product, would be less expensive than one that includes all of the anticipated functions.

Figure: Data usability
Source: (Hotz, et al, 2022)

An institution's daily activities regularly make use of descriptive data. Reports that offer a historical overview of an institution's activities, such as stock, circulation, revenue and income, are all instances of descriptive analytics. The material in such reports can be readily combined and used to provide operational snapshots of a company. Many phases of the descriptive analytical method can be simplified by business intelligence tools such as Power BI, Tableau and Qlik.
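To give a flavour of the descriptive step, a minimal sketch that summarises a historical revenue series. Real reports would come from BI tools such as those named above; the monthly figures here are invented.

```python
# Descriptive analytics in miniature: summarise historical revenue.
# The monthly figures are hypothetical.
import statistics

monthly_revenue = [120, 135, 128, 150, 142, 160]  # USD thousands, invented

summary = {
    "total": sum(monthly_revenue),
    "mean": statistics.mean(monthly_revenue),
    "best_month": max(monthly_revenue),
    "worst_month": min(monthly_revenue),
}
```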

Probabilities are the foundation of predictive data analysis. Predictive modelling attempts to anticipate potential future results, and the likelihood of those outcomes, using a range of techniques including data analysis, numerical modelling (arithmetical relationships among factors used to anticipate results) and machine-learning techniques (classification, regression and clustering methods) (Lantz 2019, p. 20). Among the most trusted and popular predictive analytics tools is IBM SPSS Statistics, which has existed for some time and provides a wide range of features, such as the SPSS Modeler.
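The numerical-modelling idea can be sketched with an ordinary least-squares trend line fitted to a short (invented) revenue series and projected one period ahead. Real predictive work would use tools like SPSS or scikit-learn; this is only the core arithmetic.

```python
# Toy predictive step: least-squares line through past revenue,
# projected one period ahead. Data is hypothetical and perfectly linear
# so the result is easy to check by eye.
periods = [1, 2, 3, 4, 5]
revenue = [10.0, 12.0, 14.0, 16.0, 18.0]

n = len(periods)
mean_x = sum(periods) / n
mean_y = sum(revenue) / n
# slope = covariance(x, y) / variance(x)
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(periods, revenue)) / \
    sum((x - mean_x) ** 2 for x in periods)
intercept = mean_y - slope * mean_x

forecast = slope * 6 + intercept  # prediction for period 6
```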

Prescriptive analytics builds on the findings of descriptive and predictive research by outlining the optimal potential plans of operation for a company. Because it is among the most difficult stages to complete and requires a high level of analytical expertise, this step of the corporate analytics method is rarely employed in regular corporate processes. Email automation is a clear example of prescriptive analytics in action: by classifying prospects based on their goals, attitudes and motivations, marketers can send tailored email content to each category of prospects separately.
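The email-automation example can be sketched as: classify each prospect into a segment, then prescribe one template per segment. The segmentation rules and template names below are invented for illustration.

```python
# Prescriptive sketch: segment prospects, then map each segment to a
# concrete action (an email template). Rules and names are hypothetical.
TEMPLATES = {
    "engaged": "upsell_offer",
    "new_user": "onboarding_series",
    "cold": "re_engagement",
}

def segment(prospect):
    """Very simple, hypothetical segmentation rules."""
    if prospect["visits"] > 10:
        return "engaged"
    if prospect["signed_up"]:
        return "new_user"
    return "cold"

def prescribe_email(prospect):
    """Prescriptive step: choose the template for this prospect."""
    return TEMPLATES[segment(prospect)]
```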

Data Security and privacy

To safeguard database management systems from malicious intrusions and illegal usage, a broad variety of solutions is used in database security. Information security solutions are designed to defend against the abuse, loss and intrusion of not just the data stored within the network but also the data management infrastructure in general and its users (Asante et al. 2021, p. 6). The term "database security" refers to a variety of techniques, methods and technologies that provide confidentiality inside a database structure: a set of guidelines, strategies and procedures that develop and maintain the database's security, confidentiality and dependability. Openness is the most important component of data security, because it is the area where breaches occur most frequently.
Breaches can be caused by a variety of programming flaws, incorrect configurations, or habits of abuse or negligence. Nearly half of documented data thefts still have weak credentials, credential sharing, unintentional data deletion or distortion, or other unwelcome human activity as their root cause. Database governance software ensures the confidentiality and security of data by ensuring that only permitted individuals get access to it and by executing permission checks whenever access to private data is sought. One of the data breach reports involving Bank of Ireland affected 47,000 clients: the data feed from the bank to the Central Credit Register (CCR), a unified platform that gathers and safely maintains data on loans, was compromised. The incidents included unauthorised disclosures of client private information to the CCR and unintentional alterations of client private information on the CCR.
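A minimal sketch of the permission test mentioned above: access to a sensitive field is granted only to explicitly authorised roles, with denial as the default. The role and field names are hypothetical.

```python
# Deny-by-default permission check for sensitive data fields.
# Roles and fields are invented for illustration.
PERMISSIONS = {
    "credit_history": {"credit_officer", "auditor"},
    "contact_details": {"credit_officer", "support_agent"},
}

def can_access(role, field):
    """Return True only if the role is whitelisted for the field;
    unknown fields or roles get no access."""
    return role in PERMISSIONS.get(field, set())
```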

Figure: Data Security and privacy
Source: (Yang, Xiong, & Ren, 2020)

According to Shaik, Shaik, Mohammad, & Alomari (2018), database integrity refers to the safeguarding of content that is kept in databases. Businesses often maintain a variety of data within their systems, and they must employ safety methods such as encrypted networks, antivirus software and security encryption to protect that crucial data. The two key concerns regarding database confidentiality are the safety of the system itself and the moral and regulatory ramifications of whatever information is to be placed in the database in the first place. Additionally, the ethical obligation imposed on database protection experts to protect a database management structure must be taken into account.

Data accuracy, which acts as the primary yardstick of information quality, is defined as the conformity of data with reality. The recorded information must match the facts it describes, since greater conformity translates into higher dependability: it implies that the information is correct, free of mistakes, and drawn from a reliable and consistent source. Data integrity is essential because inaccurate data leads to inaccurate projections, and if the anticipated outcomes are wrong, time, money and assets are wasted. Accurate information enhances confidence in decision-making, increases productivity and marketing effectiveness, and reduces costs.

Ethical Considerations

According to Tsang (2019), conversations about how firms manage consumer data typically revolve around regulatory issues, such as risks and constraints, and with good reason: the massive collections of private data held by businesses and government agencies entail severe consequences and the potential for harm. More recent privacy regulations, including the European Union's General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA), accordingly restrict data usage and attempt to return control to the user.

The best way for a business to convince consumers to consent to the collection and use of their private details is to use that data to the customer's benefit, letting users understand what data companies gather about them and how it is used in the company's services or offerings. Every business with clients or users provides a valued offering or service, and the worth is sometimes quite clear-cut. Users of location tracking, for example, are likely aware that these apps must track user locations to show the participant's true position, adjust turn-by-turn directions, or provide real-time traffic data. Most users agree that up-to-date mapping information offers benefits that outweigh being monitored (Trivedi & Vasisht 2020, p. 77). In similar circumstances, businesses have to convince clients of the benefit of their data consumption to win their support. Users are conscious of the barter and, in some cases, are willing to permit the use of personal data if a company uses it to improve the value of its services, promote research and development, improve stock control, or for any other legitimate purpose. When businesses give clients a compelling reason to express their support, everyone wins; this requires gaining the client's trust through both behaviour and information.

Companies have an ethical responsibility to their customers to collect only the necessary material, to secure that information effectively, to limit its dissemination, and to correct any errors in the relevant data. Employees have a moral duty not to look at customer records or files unless it is essential, not to give customer data to competitors, and not to give user details to friends or relatives. Customers who share data with the companies they do business with also have an ethical responsibility in this respect (Kim, Yin, & Lee 2020, p. 2). Such compliance might comprise providing accurate and complete data as needed, as well as abiding by the prohibition on disclosing or using company data that individuals may have access to.

Artificial Intelligence

With continuing technical advancement, new and updated technologies are being adopted across sectors worldwide. Financial services are among the fastest-growing and most rapidly changing sectors, requiring in-depth analysis of internal changes that occur quickly. According to Kaur, Sahdev, Sharma, & Siddiqui (2020), artificial intelligence plays an enormous role in securing the growth and development of the financial sector. The Bank of Ireland has provided satisfactory customer services for years; in recent times, however, difficulties have arisen in its banking services due to questions about protecting customer data and restricting the bank from any malpractice with that data. In this regard, artificial intelligence is crucial for transforming the data safety and security process and winning back customers' trust. Artificial intelligence helps enhance cybersecurity and protect the bank from money laundering (Manser Payne, Dahl, & Peltier 2021, p. 15). In recent times, a large number of banks have focused on implementing artificial intelligence to ensure the safety and security of their customers' data. The areas that now require more emphasis are understanding how artificial intelligence protects data and what steps can be implemented to strengthen data safety.

Artificial intelligence generally helps with future predictions based on customers' previous activities and can distinguish between more and less important data. With the help of cognitive process automation, multiple features can be enabled appropriately. According to Manser Payne, Peltier, & Barger (2021), securing ROI reduces cost and ensures quick processing of services at each step of banking. In the finance sector it is important to review customers' financial activities quickly, which is a tough task for human labour. To make the procedure easier, banks draw on built-in automation and robotic process automation, which offer high accuracy, fewer human errors, cognitive decision-making, and more time devoted to the success of the financial business (Flavián, Pérez-Rueda, Belanche, & Casaló 2022, p. 7).


Figure: Use of AI in banks
Source: (Shambira, 2020)

The Bank of Ireland uses cloud storage to keep customer data safe and protected. The prime goals of using AI in banks are to make the bank's financial activities more efficient and customer-driven, to address issues more efficiently, and to adopt new methods to attract more customers. The Bank of Ireland is one of the most prominent banks in the country and has to handle a wide range of data; using AI technologies to their full potential will bring more efficiency to the banking system.


To conclude, the Bank of Ireland has provided services for many years, and since its inception its prime duty has been to provide safe and secure services to its customers. With increasing pressure from customers and growing questions about data protection, the banking sector is now focusing on utilising artificial intelligence, which can provide maximum safety for customer data and help grow the customer base.




MBIS4004 System Design Report Sample

Workshop Session 02

Activity 01:

Trapping a sample:

• Class will be broken in teams of 3-4 students using breakout rooms.

• Access your E-Book on page 174, read and discuss the case to answer (you have 30 min.):

• Each of you must access Discussion Board “Group X: Trapping a sample” to write your answers (you have 30 min.) - 1% Mark.

• It must be done within this given time, otherwise you won’t receive any mark.

Activity 02:


You are hired as a systems analyst by an organization that is planning to redesign its website. Consider the following situations and describe the most appropriate sampling method for each of them.

a. To gauge employee perception towards website redesign, you post a notice on the intranet portal asking the employees to post their opinions.

b. You design a customer survey to find out which website features they wish to see redesigned.

c. You seek customer opinions through forms available at three of the company’s 20 helpdesks.

Explain why the answer to each situation would vary.

• Class will be broken in teams of 3-4 students using breakout rooms.

• Read and discuss the case to answer (you have 30 min.):

• Each of you must access Discussion Board “Group X: Activity 2”


Activity 1

The class was divided into teams of 3 to 4 students using breakout rooms, and each team was provided with E-books. Every group was also given discussion boards, and regular classes were held so that students stayed in touch with their subjects. Marks were distributed strictly by the teachers based on set metrics, so only qualified students were awarded the degree. Since I was a serious student, I managed to clear all the exams, and as a result I am now a qualified systems analyst.

Q) Role of System Analyst in Designing Website

Rahmawati et al. (2022) stated that systems analysts face critical challenges, from the elicitation of requirements to the delivery of technical requirements to the development teams. The systems analyst tends to look at the design technically and functionally, while human-computer interaction specialists manage it through the lens of user interaction. Sam Pelt needs to rely on software for sampling customer opinion and for making the strategic decision of stocking fake furs in a store that has always carried real furs. Sam Pelt also needs a separate website for the company, as websites have become the most important channel of communication; the business environment is extremely competitive, and so developing the website has become mandatory.

Q) Designing Customer Survey

The systems analyst serves to optimise users' interaction with systems and software so that an organisation's employees can work effectively. Ninci et al. (2021) stated that these professionals advise employees on which software to implement and ensure that the programs function correctly. Therefore, the systems analyst employed by Sam Pelt needs to optimise the system and software so that the organisation can perform effectively. As a systems analyst, I must ensure that the computer systems and infrastructure perform effectively; I carry the responsibility of researching problems, finding solutions, and recommending courses of action. A systems analyst also needs to be conversant in several operating systems, programming languages, hardware platforms, and software packages.

Q) Role of Customer Opinions in Designing Website

A systems analyst is an individual who applies techniques of analysis and design to solve business problems with information systems. Gao et al. (2023) reviewed that systems analysts must keep up to date with modern innovations to keep improving productivity for the organisation. Therefore, as Sam Pelt's systems analyst, my main role is to improve the organisation's productivity at all times. I will leave no stone unturned in using a networked computer that supports packaged software for selecting the customer mailing list. Moreover, Sam Pelt is also interested in making strategic decisions that affect the purchasing of goods. Hence, as a systems analyst, I play a key role in ensuring that Sam Pelt succeeds in developing a website for the organisation so that it can operate effectively without any hiccups.



COIT20263 Information Security Management Report 1 Sample


This assessment task relates to Unit Learning Outcome 2 and must be done individually. In this assessment task, you will analyse the scenario given on page 3 and develop guidelines for the specified policy for the hospital given in the scenario.

Assessment Task

You are required to analyse the scenario given on page 3 and develop guidelines for an Issue-Specific Security Policy (ISSP) on the ‘Acceptable Encryption Policy’ for the organisation described in the scenario. You should ensure that you support the guidelines you prepare with references and justify why those guidelines are necessary.

Assessment 1 task contains two parts; part A is writing a report on the guidelines and part B is writing a reflection on the experience of completing the assessment task.

Part A: The report for the given scenario should include:

1. Executive Summary

2. Table of Contents

3. Discussion

a Statement of Purpose (Scope and applicability, definition of technology addresses, responsibilities)

b Acceptable ciphers and hash function requirements

c Key Generation, Key agreement, and Authentication

d Violations of Policy

e Policy Review and Modification

f Limitations of Liability

4. References

Please note that you might need to make some assumptions about the organisation in order to write this report. These assumptions should match the information in the case study and not contradict the objectives of the report. They should be incorporated in your report. To avoid loss of marks, do not make assumptions that are not relevant or contradictory, or will not be used in your report discussion.

Your discussion must be specific to the given case scenario and the discussion should be detailed with justification. Wherever appropriate please provide evidence of information (with proper referencing) to justify your argument.

Please refer to external resources as needed. Please use at least 5 relevant references.

Note: You must follow the Harvard citation and referencing guidelines when writing your report.

Part B: Your reflection on completing this assessment may include (the word limit for part B is 500 words):

• how you attempted the task, methods used,

• any hurdle faced and how those were solved

• what you have learnt

• if you are asked to do this again, would you take a different approach? Support your answer with justification.


Statement of Purpose

Scope and Applicability

The purpose of this report is to provide guidelines for the development and implementation of an Acceptable Encryption Policy for XYZ, a leading Australian private health insurance company. The policy will apply to all employees of the company, including full-time and part-time staff, and to all data and information that the business processes, transmits, or stores, including client data, employee data, and confidential company information.

Definition of Technology Addresses

Encryption technology is a vital tool that enables companies to secure their data by converting it into a coded form that can only be accessed by authorized personnel. Encryption technology involves the use of algorithms and keys to transform data into a secure format. The policy will define the types of encryption technologies that are acceptable for use by the company, including symmetric key encryption and asymmetric key encryption. The policy will also define the key lengths and encryption algorithms that are acceptable for use by the company (Lv and Qiao 2020).


Responsibilities

The policy will define the responsibilities of different roles and departments within the company. The Chief Information Security Officer (CISO) will be responsible for the overall management and implementation of the policy. The IT team at each site will be responsible for installing and maintaining the encryption software on their respective servers. The security team will be responsible for monitoring the encryption tools to ensure their effective use and for reporting any potential security breaches. All employees will be responsible for following the policy guidelines and using encryption tools appropriately to secure the data they handle. In summary, this statement of purpose defines the scope of the policy, the technology it addresses, and the responsibilities of different roles and departments within the company; the next section of the report discusses the objectives of the policy (Hajian et al. 2023).

Acceptable Ciphers and Hash Function Requirements:

Encryption is a key component of data security, and the use of effective ciphers and hash functions is critical to ensuring data protection. The Acceptable Encryption Policy for XYZ will define the acceptable ciphers and hash functions that can be used to secure data.


The policy will define the types of ciphers that are acceptable for use by the company. These ciphers will include both symmetric and asymmetric ciphers. Symmetric ciphers, such as Advanced Encryption Standard (AES), are widely used for securing data as they use only a single key to encrypt as well as decrypt data. Asymmetric ciphers, such as RSA, use two keys, a public key, and a private key, to encrypt and decrypt data. The policy will also define the key lengths that are acceptable for use with the different ciphers (Lv and Qiao 2020).
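To make the single-shared-key property of symmetric ciphers concrete, here is a toy demonstration. Note this XOR one-time pad is emphatically not AES and must never be used in production; it only illustrates that the same key both encrypts and decrypts.

```python
# Toy symmetric scheme: a XOR one-time pad. NOT AES, for illustration
# of the shared-key idea only.
import secrets

def xor_bytes(data, key):
    """XOR each data byte with the corresponding key byte."""
    return bytes(d ^ k for d, k in zip(data, key))

message = b"account balance"
key = secrets.token_bytes(len(message))  # the single shared secret

ciphertext = xor_bytes(message, key)    # encrypt
recovered = xor_bytes(ciphertext, key)  # decrypt with the same key
```

In an asymmetric scheme such as RSA, by contrast, the encryption key (public) and decryption key (private) are different, so the decryption key never needs to be shared.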

Hash Functions

Hash functions are used to transform data into a fixed-length code or hash value that is, in practice, unique to the input. This is an important aspect of data security because it allows data integrity to be confirmed by comparing the hash value of the original data to the hash value of the received data. The policy will define the acceptable hash functions that can be used to secure data, including the Secure Hash Algorithm (SHA) and Message Digest Algorithm (MD) families.
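The integrity check described above can be sketched with Python's standard `hashlib`: recompute the digest of the received data and compare it with the digest the sender published.

```python
# Integrity check with SHA-256: any change to the data changes the digest.
import hashlib

original = b"policy document v1"
sent_digest = hashlib.sha256(original).hexdigest()

received_ok = b"policy document v1"   # unmodified copy
received_bad = b"policy document v2"  # tampered copy

matches = hashlib.sha256(received_ok).hexdigest() == sent_digest
tampered = hashlib.sha256(received_bad).hexdigest() == sent_digest
```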

The policy will ensure that the ciphers and hash functions used by the company are regularly reviewed to ensure that they are still effective against current threats. The policy will also ensure that the use of weaker ciphers or hash functions is not permitted, as these may be vulnerable to attacks.

The Acceptable Encryption Policy for XYZ will define the acceptable ciphers and hash functions that can be used to secure data. This section of the policy will ensure that the ciphers and hash functions used by the company are effective against current threats and that the use of weaker ciphers or hash functions is not permitted. The next section of the report will discuss the encryption key management requirements defined in the policy (Lv and Qiao 2020).

Key Generation, Key Agreement, and Authentication:

Key generation, key agreement, and authentication are critical components of encryption that ensure the security of data. The Acceptable Encryption Policy for XYZ will define the key generation, key agreement, and authentication requirements to ensure that data is protected effectively.

Key Generation:

The policy will define the key generation requirements for the ciphers used by the company. The policy will require that keys be generated using a secure random number generator and that the key length be appropriate for the cipher. The policy will also define the process for key generation and the use of key derivation functions.
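A minimal sketch of the key-generation requirement: draw key material from a cryptographically secure random source at a length matched to the cipher (256 bits here, as would suit AES-256).

```python
# Key generation with a cryptographically secure RNG (Python's secrets
# module). The 256-bit length is an example matched to AES-256.
import secrets

KEY_BITS = 256
key = secrets.token_bytes(KEY_BITS // 8)
another_key = secrets.token_bytes(KEY_BITS // 8)  # fresh keys must differ
```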

Key Agreement:

The policy will define the key agreement requirements for the ciphers used by the company. The policy will require that key agreement be performed using a secure key exchange protocol, such as Diffie-Hellman key exchange. The policy will also define the key agreement process and the use of key agreement parameters.
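The Diffie-Hellman exchange named above can be illustrated with a toy example over a tiny prime: both parties arrive at the same shared secret without ever transmitting it. Real deployments use large standardised groups, not these toy numbers.

```python
# Toy Diffie-Hellman key agreement. The parameters are far too small
# for real use; they only show the mechanics.
p, g = 23, 5  # public parameters: prime modulus and generator
a, b = 6, 15  # each party's private value

A = pow(g, a, p)  # Alice publishes g^a mod p
B = pow(g, b, p)  # Bob publishes g^b mod p

shared_a = pow(B, a, p)  # Alice computes (g^b)^a mod p
shared_b = pow(A, b, p)  # Bob computes   (g^a)^b mod p
```

Both values equal g^(ab) mod p, so the parties agree on a secret that an eavesdropper seeing only A and B cannot easily recover.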


Authentication:

The policy will define the authentication requirements for the ciphers used by the company. The policy will require that authentication be performed using a secure authentication protocol, such as Secure Remote Password (SRP) or Public Key Infrastructure (PKI). The policy will also define the authentication process and the use of authentication parameters.

The policy will ensure that the key generation, key agreement, and authentication requirements used by the company are regularly reviewed to ensure that they are still effective against current threats. The policy will also ensure that the use of weaker key generation, key agreement, or authentication methods is not permitted, as these may be vulnerable to attacks (Niu et al. 2019).

Violations of Policy

The Acceptable Encryption Policy for XYZ is a critical component of the organization's security program. Violations of this policy can have serious consequences for the organization, including loss of data, damage to the organization's reputation, and legal liability. The policy will define the consequences of violating the policy to ensure that all employees understand the importance of compliance.

The policy will define the penalties for non-compliance, which may include disciplinary action, termination of employment, and legal action. The policy will also define the process for reporting policy violations and the procedures for investigating and addressing violations.

It is important to note that violations of this policy are not limited to intentional actions. Accidental or unintentional violations can also have serious consequences for the organization. Therefore, the policy will also define the process for reporting accidental or unintentional violations and the procedures for addressing them.

The policy will also define the process for reviewing and updating the policy to ensure that it remains effective against current threats. Regular reviews of the policy will help to identify any gaps or weaknesses in the policy and ensure that the organization is prepared to address new threats. The Acceptable Encryption Policy for XYZ will define the consequences of violating the policy, the process for reporting policy violations, and the procedures for investigating and addressing violations. The policy will also define the process for reviewing and updating the policy to ensure that it remains effective against current threats. The final section of the report will provide a conclusion and recommendations for implementing the policy (Niu et al. 2019).

Policy Review and Modification:

The Acceptable Encryption Policy for XYZ is a living document that must be reviewed and updated regularly to remain effective against new and emerging threats. The policy review process should be documented and conducted on a regular basis, with a goal of ensuring that the policy is up-to-date and relevant.

The policy review process should include an evaluation of the organization's security posture, as well as a review of current threats and trends in the industry. This evaluation should identify any weaknesses in the current policy, as well as any new technologies or encryption algorithms that may need to be added to the policy.

The policy review process should also involve stakeholders from across the organization, including the IT department, security team, legal team, and executive management. These stakeholders can provide valuable insights into the effectiveness of the policy and identify any areas that may need to be strengthened or revised (Sun et al. 2020).

Once the policy review process is complete, any modifications or updates to the policy should be documented and communicated to all relevant stakeholders. This may include training sessions for employees, updated documentation and procedures, and updates to the organization's security controls and systems (Dixit et al. 2019).

It is also important to note that changes to the policy may require approval from executive management or legal counsel. Therefore, the policy review process should include a process for obtaining this approval and documenting it for future reference.

Limitations of Liability:

The Acceptable Encryption Policy for XYZ provides guidelines and requirements for the use of encryption technology within the organization. While the policy is designed to reduce the risk of data breaches and other security incidents, it is important to note that no security measure can provide 100% protection against all threats.

Therefore, the policy includes a section on limitations of liability that outlines the organization's position on liability in the event of a security incident. This section states that while the organization will make every effort to protect the confidentiality, integrity, and availability of its data, it cannot be held liable for any damages resulting from a security incident.

This section also includes information on the steps that the organization will take to respond to a security incident, including incident response procedures, notification requirements, and any other relevant information.

It is important to note that the limitations of liability section is not intended to absolve the organization of all responsibility for data security. Rather, it is intended to provide clarity on the organization's position in the event of a security incident and to ensure that all stakeholders are aware of their responsibilities and obligations.


The Acceptable Encryption Policy for XYZ provides guidelines and requirements for the use of encryption technology within the organization. The policy outlines acceptable ciphers and hash function requirements, key generation, key agreement, and authentication procedures, as well as guidelines for addressing violations of the policy.

The policy is intended to protect confidential data from unauthorised access, disclosure, and alteration, as well as to reduce the risk of security incidents. The policy also includes provisions for reviewing and updating the policy as needed to address changes in technology or security threats.




DATA4300 Data Security and Ethics Case Study 1 Sample

Assessment Description

You are being considered for a job as a compliance expert by an organization and charged with writing recommendations to its Board of Directors’ Data Ethics Committee to decide on:

A. Adopting a new technology solution that addresses a business need, and

B. The opportunities and risks of this technology in terms of privacy, cybersecurity and ethics

Based on this recommendation you will be considered for a job at the company.

Your Task

• Choose a company as your personal case study. You must choose a company which starts with the same letter as the first letter of your first or last name.

• Complete Part A and B below:

1. Part A (Case Study): Students are to write a 700-word case study and submit it as a Microsoft Word file via Turnitin by Monday, Week 6 at 10:00am (AEST) (Before class)

Note: Completing Step 1 before Step 2 is crucial. If you have not submitted Step 1 in time for your in-class Step 2, you must notify your facilitator via email immediately to receive further instruction about your assessment status.

2. Part B (One-way interview): Students need to be present IN CLASS in Week 6 where the lecturer will take them through how to record a one-way interview based on their case study.

Assessment Instructions

PART A: Case Study (20 marks)

You are being considered for a job as a compliance expert by an organisation and charged with writing recommendations to its Board of Directors’ Data Ethics Committee to decide about:

a) Adopting a new technology solution that addresses a company need, and

b) The opportunities and risks of this technology in terms of privacy, cybersecurity, regulation and ethics and how this affects the viability of the technology.

Your answers to the above two questions will be presented in a case study which will be considered in your job application. See suggested structure below:


Chosen Company and New Technology Solution

The chosen organisation is Pinterest. Pinterest is a well-known American company that lets users share and save images and creative portfolios and generate aesthetic ideas; it also offers social media services to designers, enabling the discovery of ideas and images (Pinterest, 2020). The company holds the data of millions of users, which makes it vulnerable to data theft. As a compliance expert, my recommended technology solution to the Board of Directors' Data Ethics Committee of Pinterest is Artificial Intelligence (AI). AI is a booming technology with the potential to bring significant change to the company's operations, security posture, and management, and to improve efficiency, decision-making, and the customer experience. The use of AI also presents ethical and legal challenges that have to be considered carefully.

The key areas in which the company can enhance its performance are:

• Improved decision making- AI's ability to analyse vast amounts of data and derive useful conclusions supports more informed decisions. A company's strategic choices, discovery of new prospects, and operational optimisation may all benefit from this (Velvetech, 2019).

• Enhanced customer experiences- AI may help businesses customize their communications with customers, provide more personalized suggestions, and generally elevate the quality of their customers' experiences. This has the potential to boost satisfaction and loyalty among existing customers.

• Better risk management- AI can help Pinterest detect and prevent threats such as fraud and cyberattacks. This can help to protect the company's reputation and financial performance.

• Increased innovation- AI has the potential to boost innovation by assisting Pinterest in creating and refining new offerings and providing access to previously unexplored consumer segments. This has the potential to aid businesses in competing successfully and expanding their operations (Kleinings, 2023).

Opportunities and Risks of This Technology in Terms of Privacy, Cybersecurity, Regulation and Ethics

AI technology offers several opportunities for Pinterest to improve its operations and performance; however, it also comes with challenges and risks that have to be addressed to ensure its viability. The key opportunities and risks associated with AI technology in terms of privacy, cybersecurity, regulation, and ethics are outlined below.

Opportunities of AI

• Personalised user feeds- AI helps personalise users' search recommendations and feeds on the basis of their search history: the technology first collects user data, then runs an algorithm that analyses it to determine each user's preferences.

• Chatbots availability for customer help 24*7- Artificial intelligence has allowed chatbots to advance to the point where they are difficult to distinguish from real people. In many cases, chatbots are preferable to human customer service representatives: these bots can respond instantly to questions, provide faster service with fewer mistakes, and boost customer engagement (Kleinings, 2023).

• Customer relationship management - In addition to being an effective tool for sales teams, customer relationship management (CRM) systems represent a significant commercial breakthrough, despite the mixed results of previous CRM and sales force automation initiatives. Artificial intelligence (AI) has the potential to improve business operations and the quality of service provided to consumers in many ways.

• Intrusion detection- Most cyber defences today are reactive rather than proactive, but AI is helping to change that. By using AI to establish a standard for acceptable network behavior, businesses can better spot irregularities in traffic that may indicate the presence of malicious individuals (Qasmi, 2020).
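
The intrusion-detection idea above, establishing a baseline of normal behaviour and flagging deviations, can be sketched as follows. This is a minimal hypothetical example (the traffic figures and function names are invented for illustration), not a description of any real detection system:

```python
from statistics import mean, stdev

def build_baseline(samples):
    """Summarise normal traffic (e.g. requests per minute) as mean and std dev."""
    return mean(samples), stdev(samples)

def is_anomalous(value, baseline, threshold=3.0):
    """Flag traffic that deviates from the baseline by more than
    `threshold` standard deviations (a simple z-score test)."""
    mu, sigma = baseline
    return abs(value - mu) > threshold * sigma

# Hypothetical traffic observed during normal operation
normal_traffic = [98, 102, 95, 101, 99, 103, 97, 100, 96, 104]
baseline = build_baseline(normal_traffic)

print(is_anomalous(100, baseline))  # ordinary load
print(is_anomalous(500, baseline))  # suspicious spike
```

Production systems would use far richer features and learned models, but the core move is the same: learn what "normal" looks like, then alert on outliers.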

Risks of AI

• Privacy concerns- Privacy concerns arise because AI technology gathers and analyses massive volumes of data. To secure customer information and remain compliant with privacy laws, businesses must take the necessary precautions (Thomas, 2019).

• Cybersecurity risks- Artificial intelligence (AI) systems may be vulnerable to cyber dangers like hacking and data leaks. Companies must take strong cybersecurity precautions to guard against these dangers.

• Regulatory challenges- Businesses using AI technology must adhere to a wide range of regulations. Failure to follow these rules could expose Pinterest to financial penalties, legal action, and reputational harm (Murillo, 2022).

• Ethical considerations- Questions of fairness and equality, including prejudice and discrimination, can arise in the context of AI usage.

• Legal issues- Concerns about legal responsibility arise when AI is used to make judgements that have far-reaching effects on people or corporations (Murillo, 2022).

• Lack of transparency- Decisions made by AI may be less transparent, making it more challenging for people and groups to grasp the reasoning behind them (Thomas, 2019).



DATA4500 Social Media Analytics Report 3 Sample

Your Assessment

• This assessment is to be done individually.

• Students are to write a 1,500-word report about Influencers and Social Media Markers and submit it as a Microsoft Word file via the Turnitin portal at the end of Week 10.

• You will receive marks for content, appropriate structure, and referencing.

Assessment Description

• You are the Digital Marketing Officer in charge of picking a social media influencer to lead an extensive campaign as the face of your organization.

• As part of your research and decision-making process, you must gather and analyse more than just average likes and comments per post.

• Some of the statistics you will need to gather and assess are (only as an example):

o Follower reach.

o Audience type (real people, influencers, and non-engaging).

o Demographics.

o Likes to comments ratio.

o Brand mentions.

o Engagement rates for social media accounts.

o How data on competitors’ use of influencers can be measured to generate insights.

Assessment Instructions

• You have been asked to write a report on your options and choice, the criteria you used, and any tool that will support your work in the future.

• Some of the information you are expected to cover in your report is:

o What is the audience-type composition?

o What is an engagement rate, and how should advertisers treat this statistic?

o When is an engagement considered an authentic engagement?

o Why should we care about the followings of followers?

o How does our influencer ROI compare against that of our competitors?

• Your report should include the following:

o Cover.

o Table of Contents (see template).

o Executive Summary (3-4 paragraphs).

o Introduction.

o A section discussing social media analytics and its value to the business.

o A section on the role of the techniques taught in class, like sentiment analysis, competitive analysis, data mining, and influencer analysis.

o A section on how social media analytics was used to choose the influencer you recommend.

o A section recommending how your choice of influencer will be used as part of the organization’s marketing strategy.

o At least ten references in Harvard format (pick an additional five on your own besides five from the list below).



Utilizing social media sites to advertise or sell something is known as social media marketing. In order to interact with target audiences, it involves creating and sharing content on social media platforms like Facebook, Twitter, Instagram, and LinkedIn. Social media influencers are people who have a significant following on the internet and are regarded as authorities in a certain industry. Brands can use them to advertise their goods or services to a wider demographic. To inform advertising strategies, social media analytics entails the measurement, analysis, and reporting of data from social media platforms. Businesses can employ it to better understand their target market, spot trends, and evaluate the effectiveness of their social media marketing strategies. Businesses may measure KPIs like engagement, reach, and conversions by using social media analytics tools to optimise their social media marketing efforts.

Social media analytics and its value to the business

- Characteristics

The collection and analysis of data from social media platforms in order to inform marketing tactics is known as social media analytics. Some of its key characteristics are:

Real-time data: Social media analytics provides access to real-time data, allowing marketers to monitor trends and respond to feedback quickly.

Numerous metrics: The engagement, reach, impressions, and conversion rates of social media campaigns can all be tracked using a variety of metrics provided by social media analytics (Enholm et al., 2022).

Customizable reports: Social media analytics tools can be customized to generate reports that meet specific business needs, such as tracking campaign performance or analyzing customer sentiment.

Competitive analysis: Social media analytics may be used to keep tabs on rival activity, revealing market trends and spotting development prospects (Nenonen et al., 2019).

Data visualization: To assist managers in rapidly and simply understanding complicated data sets, social media analytics solutions frequently include data visualization techniques, such as charts and graphs.

Machine learning: Social media analytics increasingly uses machine learning methods to spot patterns and trends in data, allowing for more precise forecasts and suggestions for the next marketing plans.

- Its value in business

Businesses may benefit significantly from social media analytics by using it to make data-driven choices and improve their social media strategies. Following are some examples of how social media analytics may help businesses:

Audience insights: Social media analytics may give businesses information on the preferences, interests, and behaviors of their target audience, allowing them to develop more specialized and successful social media campaigns (Zamith et al., 2020).

Monitoring the success of social media initiatives: Social media analytics may be used to monitor the success of social media campaigns, enabling organizations to assess engagement, reach, and conversion rates and modify their strategy as necessary.

Competitive analysis: By using social media analytics to track rivals' social media activity, firms can stay current on market trends and spot growth prospects.

Reputation management: Social media analytics may be used to track brand mentions and social media sentiment, enabling companies to address negative comments and manage their online reputation (Aula and Mantere, 2020).

Measurement of ROI: Social media analytics may be used to assess the return on investment (ROI) of social media efforts, enabling companies to evaluate the efficacy of their social media plans and more efficiently deploy their resources.
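
As a rough illustration of how an engagement metric behind such ROI assessments might be computed, the sketch below uses one common definition of engagement rate (interactions divided by followers). All figures, names, and the exact formula are hypothetical; real analytics tools define and weight these metrics differently:

```python
def engagement_rate(likes, comments, shares, followers):
    """Engagement rate per post, in percent: interactions as a share
    of followers. One common definition; platforms and tools vary."""
    return (likes + comments + shares) / followers * 100

# Entirely hypothetical posts from a candidate influencer
posts = [
    {"likes": 12000, "comments": 800, "shares": 200},
    {"likes": 9500, "comments": 650, "shares": 150},
]
followers = 500_000

rates = [engagement_rate(p["likes"], p["comments"], p["shares"], followers)
         for p in posts]
avg_rate = sum(rates) / len(rates)
print(f"Average engagement rate: {avg_rate:.2f}%")
```

Averaging over many recent posts, rather than a handful, gives a more stable figure to compare against competitors.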

Roles of the techniques like sentiment analysis, competitive analysis, data mining, and influencer analysis.

Businesses can use social media analytics to measure and improve their social media marketing strategies, and different analytical techniques serve different goals. Here is a brief synopsis of each technique's function:

Sentiment analysis: The process of determining how a brand, product, or service is received in social media posts or comments is known as sentiment analysis. Natural language processing, or NLP, is used in this method to assess the positivity, negativity, or neutrality of text data. Monitoring a brand's reputation, determining trends in customer sentiment, and responding to negative feedback can all benefit from using sentiment analysis (Aula and Mantere, 2020).
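
A bare-bones illustration of the word-counting idea behind sentiment analysis is sketched below. The word lists are invented for the example; production systems use trained NLP models rather than a hand-made lexicon:

```python
# Tiny illustrative lexicon; real sentiment analysis uses trained models.
POSITIVE = {"great", "love", "excellent", "amazing", "good"}
NEGATIVE = {"bad", "terrible", "hate", "awful", "poor"}

def sentiment(text):
    """Classify a post as positive, negative, or neutral by counting
    lexicon hits -- a minimal stand-in for NLP-based sentiment analysis."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("I love this brand, the service is great"))
print(sentiment("Terrible support, awful experience"))
```

Running a classifier like this over brand mentions, then tracking the positive/negative ratio over time, is the basic mechanics of reputation monitoring.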

Competitive analysis: Monitoring and analyzing competitors' social media activities is part of competitive analysis. This method can be used to recognize industry patterns, benchmark performance against competitors, and identify opportunities for growth. Businesses can benefit from competitive analysis by staying ahead of the curve and making well-informed decisions regarding their social media marketing strategies (Jaiswal and Heliwal, 2022).

Data mining: The process of looking for patterns and trends in large datasets is known as data mining. In social media analytics, data mining can be used to discover customer preferences, behavior patterns, and interests. This can help organizations create more targeted social media campaigns and improve engagement rates.
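
The pattern-finding idea can be illustrated with a minimal frequent-pair count over hypothetical post hashtags, the core move behind market-basket-style data mining (the posts and tags are invented for the example):

```python
from collections import Counter
from itertools import combinations

# Hypothetical hashtag sets taken from individual posts
posts = [
    {"#tech", "#ai", "#startup"},
    {"#tech", "#ai"},
    {"#travel", "#photo"},
    {"#tech", "#startup"},
]

# Count how often pairs of hashtags appear together in the same post --
# frequently co-occurring pairs hint at audience interests and topics.
pair_counts = Counter()
for tags in posts:
    for pair in combinations(sorted(tags), 2):
        pair_counts[pair] += 1

print(pair_counts.most_common(2))
```

At real scale the same counting is done with dedicated frequent-itemset algorithms (e.g. Apriori-style approaches), but the output, "which things co-occur often", is the same.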

Influencer analysis: The process of identifying social media influencers with a large following and high engagement rates in a specific industry or niche is called influencer analysis. This method can be used to identify potential brand ambassadors and create influencer marketing campaigns. Businesses can use influencer analysis to reach a wider audience and raise brand awareness (Vrontis et al., 2021).

Each of these social media analytics techniques plays a unique role in helping organisations achieve their social media marketing goals. By using a combination of them, organisations can gain valuable insight into their target audience, monitor competitor activity, and optimise their social media strategies for maximum impact.

How social media analytics was used to choose the recommended Influencer

Social media analytics can be a powerful tool for identifying social media influencers who can help brands reach their target audience and achieve their marketing goals. Social media analytics played a crucial role in the decision to select Elon Musk as a social media influencer.

- In the beginning, social media analytics tools were used to identify the tech industry's most influential individuals (Kauffmann et al., 2020). This involved looking at data from social media platforms like Twitter and LinkedIn to find people with large followings, high engagement, and a strong social media presence.

- To assess the influencers' overall sentiment and level of influence in the tech industry, the social media analytics team performed sentiment analysis on their social media posts. They also conducted competitive analysis to compare the identified influencers' social media performance against one another.

Elon Musk emerged as a leading social media influencer in the tech industry based on the insights gleaned from these social media analytics techniques (Ding et al., 2021). He was an ideal candidate for a partnership as a social media influencer with a tech company due to his large social media following, high engagement rates, and positive sentiment in the tech community.

Data mining methods were also used by the social media analytics team to gain a deeper understanding of Musk's social media habits and interests. This involved looking at his activity on social media to find patterns, preferences, and interests that could be used to design a successful social media marketing campaign.

Social media analytics played a large part in the selection of Elon Musk. By using a variety of social media analytics approaches, the team was able to identify the most prominent individuals in the tech sector, carry out sentiment and competitive analyses, and obtain deeper insights into Musk's social media behaviour and interests. This helped ensure that the influencer was an ideal match for the brand's marketing objectives and target market.

Recommending how your choice of influencer will be used as part of the organization’s marketing strategy

Using Elon Musk as a social media influencer can be a great way to reach a larger audience and spread awareness of your company or brand. With more than 1.4 million followers on Instagram and over 9 million on Twitter, Musk has a large social media following, and his largest audience segment is young men between the ages of 18 and 24. His fanbase is also highly active, with high rates of likes, comments, and shares on his posts. Musk's Twitter account has an average engagement rate of 2.84%, much higher than the sector average of 0.45%, and his Instagram account averages 1.5 million likes per post.

As for sentiment, Musk is known for his unconventional approach and forward-looking vision, which frequently draws strong emotional reactions from his audience. His posts often generate a mix of positive and negative feelings, with excitement, interest, and inspiration the most common positive ones.

Social media analytics tools can be used to track metrics such as follower growth, engagement rates, and audience demographics for both the influencer and their competitors, giving insight into how competitors are using influencers. These tools can also monitor brand mentions and sentiment across platforms. In this way, businesses gain a deeper understanding of how their audience views their rivals and how to enhance their own social media strategy, and can make more informed decisions about how to use influencers to stay ahead of the competition.

Use Elon Musk as a social media influencer as part of your company's marketing strategy in the following ways:

Collaborate with Musk on a social media takeover: Allow Musk to take over your organisation's social media accounts for a day, a week, or a month (Hunsaker and Knowles, 2021). This will give him the chance to promote your brand to his huge following, share his thoughts on your industry, and engage with your audience.

Work on social media campaigns with Musk: Create a social media campaign for your brand, product, or service with Musk's help. Sponsored posts, videos, and social media contests are all examples of this.

Leverage Musk's social media presence to create buzz: Share Musk's posts and content on your organisation's social media accounts to take advantage of his enormous following and generate buzz around your brand (Guan, 2022).

Run influencer marketing campaigns: Work with Musk to make him a brand advocate for your company. This can entail producing a number of sponsored articles, videos, and other pieces of content to advertise your company, its goods, or its services.

Reach out to Musk's audience: Take advantage of Musk's social media following by interacting with his followers through comments, direct messaging, and other online exchanges (Milks, 2020). By doing this, you may strengthen your bonds with your supporters and draw in new clients for your business.

Using Elon Musk as a social media influencer can be a great way to reach a wider audience and generate buzz around your brand or organization. By partnering with Musk on social media campaigns, leveraging his massive following, and engaging with his audience, you can build your brand, attract new customers, and generate long-term growth.


This report covers social media analytics and social media influencers. The key characteristics of social media analytics are discussed, followed by its value to business. The roles of different analytical techniques are then analyzed: sentiment analysis, competitive analysis, data mining, and influencer analysis. Finally, the report discusses how social media analytics was used to choose Elon Musk as an influencer and how Musk, as a social media influencer, can shape business strategy.



DATA4200 Data Acquisition and Management Report 2 Sample

Your Task

This report will enable you to practice your LO1 and LO2 skills.

• LO1: Evaluate ethical data acquisition and best practice about project initiation

• LO2: Evaluate options for storing, accessing, distributing, and updating data during the life of a project.

• Complete all parts below. Consider the rubric at the end of the assignment for guidance on structure and content.

• Submit the results as a Word file in Turnitin by the due date.


Unstructured data has typically been difficult to manage, since it has no predefined data model, is not always organised, and may comprise multiple types, for example data from thermostats, sensors, home electronic devices, cars, images, sounds, and PDF files.

Given these characteristics, special collection, storage, and analysis methods, as well as software, have been created to take advantage of unstructured data.

Assessment Instructions

Given the considerations above, select one of the following industries for your assessment.

• Healthcare

• Retail - clothing

• Social Media

• Education

• Motor vehicles

• Fast Foods

1. Read relevant articles on the industry you have chosen.

2. Choose one application from that industry.

3. Introduce the industry and application, e.g., healthcare and image reconstruction.

4. Explain what sort of unstructured data could be used by an AI or Machine Learning algorithm in the area you chose.

a. Discuss best practice and options for

b. Accessing/collecting

c. Storing

d. Sharing

e. Documenting

f. and maintenance of the data

5. Propose a question that could be asked in relation to your unstructured data and what software might help you to run AI and answer the question.


Introduce the industry and application

The healthcare industry is made up of a variety of medical services, technologies, and professionals who work to improve people's health and well-being. It includes hospitals, clinics, pharmaceutical companies, manufacturers of medical devices, research institutions, and many more. The healthcare industry is always looking for new ways to improve patient care and outcomes in order to diagnose, treat, and prevent diseases.

One significant application within the healthcare industry is medical image processing: the acquisition, analysis, and interpretation of medical images for diagnosis and treatment. It is essential in a number of medical fields, including orthopedics, cardiology, neurology, oncology, and radiology. Medical images can be obtained through modalities such as X-ray, computed tomography (CT), magnetic resonance imaging (MRI), ultrasound, and positron emission tomography (PET); advanced algorithms and computer-based techniques are then used to extract meaningful information from them. These images depict the body's internal structures, organs, tissues, and physiological processes in great detail. Medical image processing benefits patients and healthcare professionals alike. It enables more precise and efficient diagnosis by giving detailed insight into the presence, location, and characteristics of abnormalities or diseases. Doctors can analyse images to find conditions such as tumors, fractures, and blocked blood vessels, which helps with treatment planning and monitoring (Diène et al., 2020). Medical image processing also aids healthcare research and development: it makes it possible to build large image databases for training machine learning algorithms, which can automate image-analysis tasks, increase productivity, and reduce human error. Furthermore, it supports the investigation of new imaging procedures, such as functional MRI or diffusion tensor imaging, which provide valuable insights into brain function and neural networks.

Discussion of the sorts of unstructured data that could be used by an AI or Machine Learning algorithm in medical image processing

There are many different kinds of unstructured data that can be used in image processing. Unstructured data is information that does not adhere to a predetermined data model or organization. Here are a few instances of unstructured data used in medical image processing:

Image pixels: An image's raw pixel values are themselves unstructured data. Algorithms can use the color information in each pixel, such as RGB values, to extract features or carry out tasks like image classification and object detection.
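
As a minimal illustration of deriving a feature from raw pixel values, the sketch below computes the average colour of an image represented as a flat list of (R, G, B) tuples. The tiny image is invented for the example; real pipelines operate on large arrays with libraries such as NumPy:

```python
def mean_rgb(pixels):
    """Average colour of an image given as a flat list of (R, G, B)
    tuples -- one of the simplest features derivable from raw pixels."""
    n = len(pixels)
    r = sum(p[0] for p in pixels) / n
    g = sum(p[1] for p in pixels) / n
    b = sum(p[2] for p in pixels) / n
    return (r, g, b)

# A hypothetical 2x2 image: two red pixels, two blue pixels
tiny_image = [(255, 0, 0), (255, 0, 0), (0, 0, 255), (0, 0, 255)]
print(mean_rgb(tiny_image))  # (127.5, 0.0, 127.5)
```

Simple aggregate features like this are what classical classifiers consumed; modern deep networks instead learn features directly from the same raw pixel data.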

Metadata for Images: Metadata that accompanies images typically contains additional information about the image. The camera's make and model, exposure settings, GPS coordinates, timestamps, and other information may be included in this metadata (Galetsi et al., 2020). This information can be used by machine learning algorithms to improve image analysis, such as locating an image or adjusting for particular camera characteristics.

Figure 1: Machine learning for medical image processing
(Source: https://pubs.rsna.org)

Captions or descriptions of images: Human-created descriptions or captions associated with images provide textual context that AI algorithms can use. For tasks like image search, content recommendation, or sentiment analysis, natural language processing techniques can analyze these descriptions and extract useful information.

Labels and annotations: Unstructured data can also include manual annotations or labels added to images by people. These annotations may indicate bounding boxes, semantic segmentation masks, regions of interest, or objects. Machine learning algorithms can use this labeled data for training and validation, enabling tasks like object recognition, semantic segmentation, or image localization.

Image Content: Textual elements, such as signs, labels, or captions, can also be present in unstructured data contained within images (Panesar, 2019). Optical character recognition (OCR) techniques can extract this text so that algorithms can process and analyze the textual information contained in the images.

Image Context: Unstructured data can also provide information about an image's context, such as its source website, related images, or user interactions. Machine learning algorithms can improve content filtering, image comprehension, and recommendation systems by taking this context into account.

Discuss Best Practice and Options

Accessing/collecting, storing, sharing, documenting, and maintaining data are all very important for the healthcare industry. Below is a discussion of some options and practices related to these procedures in the healthcare industry and in image processing.


Collecting the Data

Collection of healthcare data is important for medical experts to provide better services to their patients. Below are the options and practices related to this process.

Information Sources: Medical imaging archives, picture archiving and communication systems (PACS), wearable devices, and other relevant sources of healthcare data should be identified by the medical experts (Pandey et al., 2022). Collaboration with healthcare providers and institutions is required to gain access to the necessary data.

Security and privacy of data: Adhering to strict privacy and security protocols to safeguard sensitive patient information is a best practice. Maintaining patient confidentiality by complying with laws such as the Health Insurance Portability and Accountability Act (HIPAA) is an important part of collecting healthcare data.

Data Quality: Examine the collected data for accuracy and quality. Data cleaning and preprocessing methods should be employed to address any inconsistencies, missing values, or errors that could hinder the performance of the image processing algorithms.


Storing the Data

Image processing depends on healthcare data being stored effectively. Consider the following options and best practices:

Online storage: Medical experts can use secure cloud storage options to store healthcare data. Cloud platforms provide scalability, accessibility, and backup capabilities. Encryption and access controls should be implemented to safeguard the stored data (Jyotiyana and Kesswani, 2020).

Data Lake/Repository: Creating a centralized data lake or repository is required to consolidate healthcare data for image processing. This allows easy retrieval, sharing, and collaboration among researchers and healthcare professionals.

Formats and Standards: Adhering to standard formats such as Digital Imaging and Communications in Medicine (DICOM) for medical images and Health Level 7 (HL7) for clinical data helps to store medical data and use it properly in image processing. This guarantees compatibility and interoperability across different systems and facilitates data sharing and integration.

Sharing Medical Information for Image Processing

Sharing healthcare data is important for collaborative research and for improving patient care. Consider the following recommended practices:

Agreements for the Sharing of Data: Formal data-sharing agreements or contracts that set out the terms, conditions, and restrictions for data sharing should be followed by medical experts in order to share essential data appropriately (Tchito Tchapga et al., 2021). This ensures legal and ethical compliance, protecting patient privacy and intellectual property rights.

Techniques for De-Identification: Patient-specific information can be anonymized in shared data using de-identification techniques while the data remains useful for image processing. In this way, data can be shared while privacy is maintained.
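As a concrete illustration of de-identification, the following sketch (plain Python; the field names, salt value, and record are all hypothetical, and this is far simpler than a HIPAA-grade pipeline) drops direct identifiers and replaces the patient ID with a salted hash, so that records from the same patient remain linkable without exposing the original ID:

```python
# Minimal de-identification sketch (illustrative only, not HIPAA-grade):
# direct identifiers are dropped, and the record ID is replaced with a
# salted hash so that linked records stay linkable without exposing the ID.
import hashlib

DIRECT_IDENTIFIERS = {"name", "address", "phone"}  # fields to remove
SALT = "site-specific-secret"  # hypothetical secret kept by the data holder

def pseudonymise(record):
    clean = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    token = hashlib.sha256((SALT + record["patient_id"]).encode()).hexdigest()
    clean["patient_id"] = token[:16]  # shortened pseudonym
    return clean

record = {"patient_id": "P-1001", "name": "Jane Doe",
          "address": "1 High St", "phone": "555-0100",
          "modality": "CT", "finding": "nodule"}

print(pseudonymise(record))
```

Because the hash is deterministic for a given salt, the same patient always maps to the same pseudonym, which preserves longitudinal linkage while the salt stays with the data holder.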

Transfer data safely: Encrypted and secure channels are required for transferring healthcare data. This helps to maintain confidentiality and prevent unauthorized access or interception, which could harm the treatment process. Safe data transfer also helps medical experts improve their services and get better responses from patients.


Documenting the Data

Healthcare data must be properly documented for long-term reproducibility and usability. Here is a discussion of some options and practices related to the documentation of healthcare data for image processing. Medical experts should capture and record thorough metadata associated with healthcare data, including patient demographics, acquisition parameters, and preprocessing steps. This information helps in understanding the context of the data and ensuring its traceability (Willemink et al., 2020). Documentation of healthcare data is very important, and medical experts should carry it out properly in order to provide better services to patients.

Maintenance of the data

Version Management: Medical experts should implement version control mechanisms to keep track of changes to the data, algorithms, or preprocessing methods over time. This makes reproducibility and comparison of results possible.

Governance of Data: Medical experts should establish data governance policies and procedures to guarantee data integrity, accessibility, and compliance with regulatory requirements (Ahamed et al., 2023). These policies should be checked and updated on a regular basis to keep up with new technologies and best practices.

Healthcare data for image processing must be accessed, collected, stored, shared, documented, and maintained with careful consideration of privacy, security, data quality, interoperability, and compliance. Researchers and healthcare organizations can harness the power of healthcare data to advance medical imaging and patient care by adhering to best practices.

Propose a question that could be asked in relation to the unstructured data, and what software might help run AI to answer the question

Question: "How can AI be used to improve lung cancer diagnosis accuracy by analyzing medical images drawn from unstructured data?"

TensorFlow can be suggested as software that might help run AI algorithms to answer this question. TensorFlow is an open-source library widely used for machine learning and deep learning tasks, including image processing. It provides a comprehensive framework for building and training neural networks, making it suitable for developing AI models to analyze medical images for lung cancer detection (Amalina et al., 2019). The extensive ecosystem and community support of TensorFlow also make it possible to integrate other image processing libraries and tools, making it easier to create and deploy accurate AI models for better healthcare diagnosis.
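While a production model would be built with a deep learning library such as TensorFlow, the core operation of a convolutional layer can be illustrated with a hand-rolled 2D convolution in plain Python (the image and kernel values here are purely illustrative; a real lung-nodule detector learns its kernels from labelled scans rather than using a fixed edge filter):

```python
# Hand-rolled 2D convolution on a tiny grayscale "image", to illustrate the
# core operation a convolutional layer performs. Values are illustrative;
# a real lung-nodule model would learn its kernels from data.

def conv2d(image, kernel):
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [
        [
            sum(image[i + di][j + dj] * kernel[di][dj]
                for di in range(kh) for dj in range(kw))
            for j in range(out_w)
        ]
        for i in range(out_h)
    ]

# Vertical-edge kernel applied to a 4x4 image with a bright right half.
image = [[0, 0, 9, 9]] * 4
kernel = [[-1, 0, 1],
          [-1, 0, 1],
          [-1, 0, 1]]

# Every 3x3 window straddles the intensity edge, so all responses are strong.
print(conv2d(image, kernel))  # [[27, 27], [27, 27]]
```

A CNN stacks many such learned filters with nonlinearities and pooling; the toy version above only shows why convolution responds to local intensity structure such as lesion boundaries.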



DATA6000 Industry Research Report 4 Sample

Assessment Description

In order to synthesise what you have learnt in your Analytics degree you need to submit an industry research report. This report needs to:

1. Outline a business industry problem that can be addressed through data analytics

2. Apply descriptive and predictive analytics techniques to the business problem

3. Provide recommendations addressing the business problem with reference to data visualisations and outputs

4. Communicate these recommendations to a diverse audience made up of both analytics and business professionals within the report

Assessment Instructions

In your report please follow the below structure.

1. Executive Summary (100 words)

• Summary of business problem and data-driven recommendations

2. Industry Problem (500 words)

• Provide industry background

• Outline a contemporary business problem in this industry

• Argue why solving this problem is important to the industry

• Justify how data can be used to provide actionable insights and solutions

• Reflect on how the availability of data affected the business problem you eventually chose to address.

3. Data processing and management (300 words)

• Describe the data source and its relevance

• Outline the applicability of descriptive and predictive analytics techniques to this data in the context of the business problem

• Briefly describe how the data was cleansed, prepared and mined (provide one supporting file to demonstrate this process)

4. Data Analytics Methodology (400 words)

• Describe the data analytics methodology and your rationale for choosing it

• Provide an Appendix with additional detail of the methodology

5. Visualisation and Evaluation of Results (300 words not including visuals)

• Visualise descriptive and predictive analytics insights

• Evaluate the significance of the visuals for addressing the business problem

• Reflect on the efficacy of the techniques/software used

6. Recommendations (800 words)

• Provide recommendations to address the business problem with reference to data visualisations and outputs

• Effectively communicate the data insights to a diverse audience

• Reflect on the limitations of the data and analytics technique

• Evaluate the role of data analytics in addressing this business problem

• Suggest further data analytics techniques, technologies and plans which may address the business problem in the future

7. Data Ethics and Security (400 words)

• Outline the privacy, legal, security and ethical considerations relevant to the data analysis

• Reflect on the accuracy and transparency of your visualisations

• Recommend how data ethics needs to be considered if using further analytics technologies and data to address this business problem


Executive Summary

A business strategy works as a backbone that leads the business to achieve its desired goals, driving profit and securing future decision making in a competitive market. The airline industry serves many purposes, and the problem of customer satisfaction affects most of them. The proposed solution is to analyse the customer satisfaction rate across the different services the airline offers to its passengers. The analysis examines the services offered by the airline industry to its customers or passengers during travel in order to measure the satisfaction rate, which can outline the key components affecting the business and the reasons for customer dissatisfaction.

Industry Problem

The airline industry offers a number of services to passengers during travel, with some customer services provided through business partners. Services are offered to passengers as well as for cargo via different modes, including jets, helicopters, and airliners. Airlines are among the best-known businesses in the travel industry, offering passengers the use of their seats and space for hire.

Contemporary business problems

There are multiple challenges in the aviation industry, including:

• Fuel Efficiency
• Global Economy
• Passenger satisfaction
• Airline infrastructure
• Global congestion
• Technological advancement
• Terrorism
• Climate change

Figure 1 Industry issues

These contemporary problems have a strong effect on the travel industry, especially on airlines. The most frequently faced business problem for an airline is passenger satisfaction, which affects the business more than all the other problems.

The fashion and textile industry has been an important part of the British economy for many centuries. Through innovation and invention, Britain led the world in textile manufacturing during the Industrial Revolution: inventions such as the spinning jenny, the water frame, and the water-powered spinning mill were all British innovations.

The fashion and textile industry in England, Wales, and Scotland employs around 500,000 people, made up of 88,000 employed in manufacturing, 62,000 in wholesale, and 413,000 in retail. Tens of thousands of businesses were operating in the UK fashion and textile sector in 2020, across the manufacturing, wholesale, and retail parts of the industry.

As the UK clothing market rebounds and both production and consumption start thriving again, the quantity of unwanted apparel is also soaring, and is becoming one of the biggest challenges for environmental and economic sustainability in the UK.

According to a recent study performed by the UK supermarket chain Sainsbury's, consumers in the UK were expected to throw away around 680 million pieces of clothing in a single spring as they updated their wardrobes for the new season. Within this heap of unwanted apparel, an estimated 235 million clothing items were expected to end up in landfill, causing a massive negative effect on the environment (Ali et al., 2020).

The survey also suggests that each UK consumer will dispose of an average of nineteen clothing items in a year, of which around seven will be thrown directly into the bin. Over 49% of the people surveyed believed that worn or dirty clothing items cannot be donated, prompting the industry to urge the public to set such items aside for reuse regardless of their condition (Indra et al., 2022).

Furthermore, one in six respondents claimed that they have inadequate time, or cannot be bothered, to sort and recycle unwanted clothing items, while demand for clothing that can be recycled has risen by around 6%. The industry is now engaging in various initiatives to recycle fabric into new garments for the sake of environmental sustainability.

Textile waste is becoming one of the biggest challenges for environmental and economic sustainability across the world. The UK is not the only contributor; other nations also contribute notably to the issue. Over 15 million tonnes of textile waste is produced each year in the United States, while a huge 9.35 million tonnes of textiles are landfilled in the European Union every year.

Data processing and management

Data Source

The data chosen for the exploratory data analysis of the airline industry is from Kaggle and describes the different airline services offered to passengers, including the attributes:

Id, gender, customer type, age, class of travel, satisfaction, and satisfaction rate. These are the main attributes on which the analysis is performed to assess passenger satisfaction with the airline industry. Visualisations of these attributes describe the services passengers most liked during travel and the satisfaction ratings they gave to the services they used.

Figure 2 Airline industry dataset

Applicability of descriptive and predictive analytics

Descriptive and predictive analytics are carried out in order to support better future decisions by analysing past services. Descriptive analytics describes the positives and negatives in the company's services that increased or decreased the customer satisfaction rate, while predictive analytics builds on the descriptive results to provide potential future outcomes, combining all the identified problems and finding a solution that reduces the negatives and delivers better future results.

Data cleaning

Data processing was done by removing and dropping the columns not required for the analysis: the data contains some attributes that have no use in the analysis, and these were dropped. Further cleaning was done by checking for null values and filling the gaps so that no noise would arise during analysis and visualisation of the data attributes (Sezgen et al., 2019). Data mining was done by extracting all the necessary information about the services provided to the passengers and comparing it with the satisfaction sentiment given by the passengers, in order to estimate the satisfaction rate for each service used. This makes it easy for the company to examine each and every service it offers.
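The cleaning steps described above can be sketched as follows (plain Python dicts standing in for a pandas DataFrame; the column names and default values are illustrative, not the exact Kaggle schema):

```python
# Minimal sketch of the cleaning steps described above, using plain Python
# dicts in place of a pandas DataFrame. Column names are illustrative.

DROP = {"id"}                      # columns not needed for the analysis
FILL = {"satisfaction_rate": 0}    # defaults for missing values

def clean(rows):
    cleaned = []
    for row in rows:
        row = {k: v for k, v in row.items() if k not in DROP}  # drop columns
        for col, default in FILL.items():                      # fill nulls
            if row.get(col) is None:
                row[col] = default
        cleaned.append(row)
    return cleaned

rows = [
    {"id": 1, "customer_type": "Loyal", "satisfaction_rate": 4},
    {"id": 2, "customer_type": "Disloyal", "satisfaction_rate": None},
]
print(clean(rows))
# [{'customer_type': 'Loyal', 'satisfaction_rate': 4},
#  {'customer_type': 'Disloyal', 'satisfaction_rate': 0}]
```

With pandas the same two steps would be `df.drop(columns=...)` and `df.fillna(...)`; the sketch just makes the logic explicit.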

Data Analytics Methodology


Python is used in the analysis of the airline passenger satisfaction business problem. Python is widely used and known for managing and structuring data quickly and efficiently. Several Python libraries were used for an effective, scalable data analytics methodology, including:


Pandas is used for reading different forms of data; it is a data manipulation library used for handling data and managing it in different ways. In this analysis, pandas is used to store and manage the airline data and to perform various operations on it for processing and cleaning.


Matplotlib is used, with the help of NumPy, to present the data information in the form of plots and charts: NumPy manages the mathematical operations on the data to describe it statistically, and Matplotlib presents the results of those operations as plots and charts.


A further Python visualisation library is used to present the data insights as different graphs and charts, but in an interactive way, using various colours and patterns that make the data more attractive and easier to understand. The graphs generated are visually appealing and can be used by businesses to demonstrate their efficiency to prospective customers (Noviantoro and Huang, 2022).

The methodology details are further described in the Appendix, which briefly covers the methodology used for the data analytics and the predictions and calculations performed on the data using descriptive and predictive analytics techniques in the Python programming language.

Visualization and Evaluation of Results

Results of the passenger satisfaction

The analysis and visualisation treat satisfaction as a binary classification, so the "neutral or dissatisfied" category does not allow the dissatisfaction rate to be measured separately from neutrality. In addition, aspects such as flight location and ticket price are missing from the data, and these could be major factors in the analysis (Tahanisaz, 2020).

The results show that the airline achieves a higher satisfaction rate among business travellers than among personal travellers. The services passengers most disliked, or were most dissatisfied with, were online booking and seat comfort; these should be prioritised by the airline, along with on-time departure and in-flight services, since passengers appear to be sensitive to such issues (Hayadi et al., 2021).
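The kind of group-wise satisfaction rate behind these results can be sketched as follows (standard-library Python; the records are invented for illustration and do not reproduce the report's actual figures):

```python
# Illustrative sketch of a group-wise satisfaction rate calculation,
# computed with stdlib tools; the records below are made up.
from collections import defaultdict

def satisfaction_by(records, key):
    """Share of 'satisfied' responses per group defined by `key`."""
    satisfied = defaultdict(int)
    total = defaultdict(int)
    for r in records:
        total[r[key]] += 1
        if r["satisfaction"] == "satisfied":
            satisfied[r[key]] += 1
    return {g: satisfied[g] / total[g] for g in total}

records = [
    {"travel_type": "Business", "satisfaction": "satisfied"},
    {"travel_type": "Business", "satisfaction": "satisfied"},
    {"travel_type": "Business", "satisfaction": "neutral or dissatisfied"},
    {"travel_type": "Personal", "satisfaction": "satisfied"},
    {"travel_type": "Personal", "satisfaction": "neutral or dissatisfied"},
    {"travel_type": "Personal", "satisfaction": "neutral or dissatisfied"},
]
rates = satisfaction_by(records, "travel_type")
print(rates)  # Business travellers show the higher satisfaction share here
```

In pandas this is a one-liner (`df.groupby("travel_type")["satisfied"].mean()`); the stdlib version just makes the counting explicit.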


Figure 3 Satisfaction results

Figure 4 Satisfaction by gender

Figure 5 Satisfaction by customer type

Figure 6 Satisfaction by services

Figure 7 Satisfaction by total score

Figure 8 Satisfaction by total score for personal travellers

Figure 9 Satisfaction by total score for business travellers

Figure 10 Data correlation heatmap

Significance of the visuals in business

Visuals depict and communicate ideas clearly, helping to reduce costs and to resolve most business-related issues by analysing and visualising data insights for future decision making. From a business perspective, visuals save cost and time and help manage customer relationships.

Efficacy of Python Programming

The Python programming language was used for the visualisation and analytics of airline passenger satisfaction, with the Jupyter Notebook IDE and the Anaconda framework. Python is very efficient in comparison with other analytics tools because, as a high-level language, it offers concise syntax and provides better methods for analysing and visualising data.


Ideally, sustainable apparel is apparel that maximises quality and minimises negative environmental, social, and financial impacts along its supply and value chain. Clothing that is sustainable does not adversely affect people or the planet in its manufacture, transport, retail, or end-of-life management.

A variety of practical examples of sustainable apparel are on the marketplace. These vary in the degree of sustainability improvement they achieve, focusing on environmental, fair trade, and labour issues to varying extents (Shah et al., 2020). Some examples of actions to improve the sustainability of apparel are: clothing crafted from certified organic fibres; fabrics that allow us to use less energy, and pollute less, when caring for our clothes; clothing reused at the end of its first life on the second-hand market; fabric recovered at end of life to be recycled into more clothing; and Fair Trade certified garments enabling more equitable trading conditions, making sure labour standards are adhered to and preventing exploitation. Sustainability is critical because all of the decisions pursued and all of the actions taken today will affect everything in the future; sound decisions must therefore be made in the present in order to avoid restricting the choices of the generations to come. The causes of environmental destruction lie mainly in population levels, consumption, technology, and the economy; the trouble for the global environment has less to do with population growth than it does with the levels of consumption of those living in industrialised economies (Gorzalczany et al., 2021).

The relationship between green marketing and consumer behaviour is a vital subject across a wide variety of situational areas. The idea of sustainability cannot be achieved without involving the consumer. The key role of consumer behaviour (and household consumer behaviour in particular) in driving environmental impact has long been recognised. In the end, it is the consumers who dictate where the marketplace will go. Consumer wants and needs create a cycle of demand, of enterprises catering to that demand, and finally of consumer recognition through the purchase of products. The assessment of this study should assist the marketing efforts of green fashion lines and their understanding of consumer behaviour. It may also help fashion businesses in deciding whether or not to offer a green line. The industry's focus has been on cheap production and distribution of goods without giving thought to the effect on the environment (Tsafarakis et al., 2018).

Data Ethics and Security

Privacy, legal, security, and ethical considerations

The data of any business is subject to ethical measures to secure the safety and privacy of customers' personal information. Considering privacy, security, and legal issues, data access is the major thing to consider: it provides the business with the freedom to use the data for its requirements, but unauthorized access to the data and information may harm the business as well as the privacy of customers and clients (North et al., 2019).

Accuracy and transparency of visualizations

The visualisations were made accurately by training machine learning models on the airline industry data, which ensures the data is analysed accurately and efficiently and that the visuals describe accurate data insights.

Ethics in addressing future business problems

A set of designs and practices for using data to solve business issues can be combined with ethical principles so that data is used confidentially, without harming the privacy of customers and individuals, and so that results are communicated consistently in a way that lets everyone connect with the data insights and visuals.



DATA4300 Data Security and Ethics Report 3 Sample

Your Task

• Write a 1,500-word report and record a video reflecting on the report creation process (involving an initial ChatGPT-generated draft, then editing it to a final version) and lessons learned.

• Submit both as a Microsoft Word file in Turnitin and a Kaltura video file upload by Wednesday, Week 13, at 23:55 (AEST):

Assessment Description

You have been employed as a Data Ethics Officer by a medical industry board (professional body). The board asked you to write a Code of Conduct about using new technologies in healthcare, which should be kept up to date and reflect technological evolution. Because of this technology’s breakneck pace, the professional body wants to ensure a code of conduct review happens more regularly and has authorised you to experiment with AI-powered chatbots to create a faster update process.
You are to write a Code of Conduct focused on using new technologies to inform best practices in the healthcare industry. For this assessment, you will choose one of the following technologies:

a) A device that tracks and measures personal health indicators, or

b) An application that recommends mental health support therapies.

Inform your lecturer about your technology choice by Week 9

Assessment Instructions

You will be asked to produce a report and video for this assessment.


• You are to start by conducting your own research (Introduction and Considerations sections, see structure below) on your technology.

• You will then create a code of conduct draft generated by ChatGPT. Then, you will edit it to make it applicable to your chosen technology, compliant with current regulations, and meaningful to the medical board request. Your individual research will inform this.

• Your Code of Conduct will be presented in a report (a suggested structure is below). Add at least five original Harvard references and use citations to them in your report.



The increasing adoption of new technology is having a revolutionary effect on the healthcare sector. One of the most talked-about new developments is the appearance of tools for monitoring one's own health statistics. These cutting-edge tools provide people with a way to track their vital signs and collect useful information about their health in real time. Because it allows individuals to take an active role in their own health management, this technology has the potential to significantly alter the healthcare system.

Benefits of a device that tracks and measures personal health indicators

There are several benefits that a device that tracks and measures personal health indicators can provide, as mentioned below-

• Health monitoring and tracking process- Monitoring health indicators like heart rate, blood pressure, sleep, and activity levels helps people keep tabs on their progress towards healthier lifestyles. Patients may evaluate their health improvement over time, which can boost their drive and self-awareness (Han, 2020).

• Improved diagnostics- Continuous measurements from the device give clinicians a richer history of objective health data than isolated clinic readings, which can help reveal anomalies early and support earlier, more accurate diagnosis.

• Achieving Health Goals- The gadget helps create goals by delivering personalised health data insights and suggestions. This helps patients to make educated lifestyle choices and take proactive health activities (KORE, 2020).

Figure 1: benefits of health monitoring device
(Source: Scalefocus, 2022)

Privacy, Cybersecurity, and Data Ethics Concerns

Devices that store and use patient data also come with a few security issues:

• There is a risk of cyber threats and unauthorized access to sensitive health data.

• The healthcare department might use the data without any consent leading to a breach of privacy.

• The GDPR laws and other regulations related to data can be breached and data can be used to carry out cyber thefts.

Considerations on Regulatory Compliance, Patient Protection, and Risks

Cybersecurity, privacy, and ethical risks associated with a device that tracks and measures personal health indicators

Cybersecurity risks

• Data Breaches- The gadget may be hacked, exposing sensitive health data.

• Malware and viruses- Malware or viruses in the device's software or applications might compromise data security (Staynings, 2023).

• Lack of Encryption- Weak encryption may reveal sensitive health data during transmission or storage.

Privacy risks

• Unauthorised Data Sharing- Health data may be shared without permission, jeopardising privacy (Staynings, 2023).

• Insufficient Consent Procedures- Users may not completely grasp data gathering and sharing, resulting in partial or misinformed consent.

• Re-identification- Anonymized data may be re-identified, violating privacy.

Ethical risks

• Informed Consent- If users are not educated about the purpose, risks, and possible repercussions of data collection and usage, obtaining real informed consent might be difficult.

• Data Accuracy and Interpretation- Data collection or analysis errors or biases may lead to erroneous interpretations and improper health recommendations or actions (Healthcare, 2021).

Regulatory compliance issues and patient protection requirements

The key regulatory compliances and laws for protecting the data and privacy of patients whose information is used by medical-industry devices are mentioned below; these laws and regulations protect data and ensure customer safety.

• Health Insurance Portability and Accountability Act (HIPAA) - HIPAA compliance includes privacy and security requirements, breach reporting, and enforcement to safeguard healthcare system information. The HIPAA Privacy Rule applies to all healthcare professionals and covers all media types, including electronic, print, and oral. It gives people the right to see their own protected health information (PHI) and mandates disclosure of how that information is used (RiskOptics, 2022).

Figure 2: HIPAA
(Source: Splunk, 2023)

• Patient Safety and Quality Improvement Act (PSQIA) - The purpose of this regulation is to promote a culture of safety by offering peer review evaluations for information provided on healthcare mishaps. To prevent the information from being utilised in litigation against the PSO, the statute created new patient safety organisations (PSOs) (Carter Jr et al., 2022).

• Security Standards - Healthcare organisations must follow industry best practices and data security standards, such as encryption, access restrictions, and vulnerability monitoring. Standards such as ISO 27001 aim to ensure the security, integrity, and availability of patient data (Cole, 2022).

• Incident Response and Breach Reporting - Organisations must have strong incident response procedures in place to deal with data breaches or security issues as soon as possible. They must also follow breach reporting standards, informing affected persons and appropriate authorities within the indicated timeframes (Healthcare Compliance, 2023).

Figure 3: regulatory compliances
(Source: MCN Healthcare, 2018)


Code of Conduct for A Device that Tracks and Measures Personal Health Indicators

Privacy and Data Protection

1.1. Data Collection and Use: Ensure that the collection and use of personal health data by the device are transparent and conducted with the explicit consent of the individual. Clearly communicate the purpose of data collection, how the data will be used, and any third parties involved.

1.2. Data Security: Implement robust security measures to protect personal health data from unauthorized access, loss, or disclosure. This includes encryption, secure storage, access controls, and regular security assessments to identify and address vulnerabilities.

1.3. Data Retention: Establish clear guidelines on the retention period of personal health data. Retain data only for as long as necessary and securely dispose of it once no longer needed, following applicable legal and regulatory requirements.

1.4. Anonymization and De-identification: When feasible, provide options for anonymizing or de-identifying personal health data to protect individual privacy. Ensure that any re-identification risks are minimized to maintain confidentiality.
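As one hedged illustration of section 1.4, the sketch below pseudonymises a patient identifier with a keyed hash (HMAC). The key name and record fields are hypothetical; a real device would manage keys in secure hardware and apply far more rigorous de-identification than a single keyed hash.

```python
import hmac
import hashlib

# Hypothetical secret held by the device vendor; the name and value are
# illustrative only, not part of any real product.
PSEUDONYM_KEY = b"rotate-me-regularly"

def pseudonymize(patient_id: str) -> str:
    """Replace a direct identifier with a keyed hash (pseudonym).

    Using HMAC rather than a plain hash means an attacker who sees the
    pseudonyms cannot brute-force patient IDs without the secret key,
    which reduces (but does not eliminate) re-identification risk.
    """
    return hmac.new(PSEUDONYM_KEY, patient_id.encode(), hashlib.sha256).hexdigest()

record = {"patient_id": "P-10042", "resting_hr": 61}
safe_record = {**record, "patient_id": pseudonymize(record["patient_id"])}
print(safe_record["patient_id"][:8])  # stable pseudonym, raw ID removed
```

Because the hash is deterministic, the same patient always maps to the same pseudonym, so longitudinal analysis still works without exposing the raw identifier.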

Informed Consent and User Empowerment

2.1. Informed Consent: Obtain informed consent from individuals before collecting their personal health data. Clearly explain the purpose, benefits, risks, and limitations of data collection, enabling individuals to make informed decisions about using the device.

2.2. User Control: Provide individuals with mechanisms to control the collection, use, and sharing of their personal health data. Allow users to easily access, review, and modify their data preferences and provide options for data sharing with healthcare professionals or researchers.

2.3. User Education: Promote user education and awareness about the device's functionalities, data collection practices, and privacy settings. Ensure that individuals understand how to use the device safely and responsibly, empowering them to make informed decisions about their health data.

Accuracy and Reliability

3.1. Data Accuracy: Strive for accuracy and reliability in the measurements and data generated by the device. Regularly calibrate and validate the device to ensure accurate and consistent results, minimizing potential inaccuracies that could impact health decisions.

3.2. Algorithm Transparency: Provide transparency regarding the algorithms used to process and interpret personal health data. Users should have access to information about how the device calculates metrics, enabling them to understand and interpret the results accurately.

Ethical Use of Data and Algorithms

4.1. Responsible Data Use: Use personal health data only for legitimate purposes related to the device's functionalities and in accordance with applicable laws and regulations. Avoid the use of personal health data for discriminatory or unethical practices.

4.2. Avoidance of Bias: Ensure that the device's algorithms are developed and tested to minimize bias and discrimination. Regularly evaluate and address any potential biases in data collection or algorithmic decision-making processes to ensure fair and equitable outcomes.

Transparency and User Communication

5.1. Transparency of Data Practices: Provide clear and accessible information about how personal health data is handled, stored, and shared. Clearly communicate the device's data practices, including any third-party partnerships or data sharing arrangements, to foster transparency and trust.

5.2. User Communication: Establish effective channels of communication with users, allowing them to raise concerns, ask questions, or provide feedback about the device and its data practices. Promptly address user inquiries and provide transparent and meaningful responses.

Compliance with Applicable Laws and Standards

6.1. Regulatory Compliance: Adhere to all relevant laws, regulations, and standards governing the collection, use, and protection of personal health data. Stay updated with evolving regulatory requirements and ensure ongoing compliance with data privacy and protection obligations.

6.2. Industry Standards: Align with industry best practices and standards for privacy, data security, and ethical use of personal health data. This includes adhering to guidelines such as the GDPR, HIPAA, ISO 27001, and other applicable frameworks.

Accountability and Governance

7.1. Accountability: Establish clear accountability and governance mechanisms for the responsible use of personal health data. Assign roles and responsibilities for data privacy, security, and ethical considerations. Conduct regular audits and assessments to ensure compliance and identify areas for improvement.

7.2. Continuous Improvement: Regularly review and update the device's functionalities, privacy policies, and data practices to reflect advancements in technology, evolving regulatory requirements, and user feedback. Continuously strive for enhanced privacy, security, and ethical standards.

By following this Code of Conduct, developers, manufacturers, and operators of devices that track and measure personal health indicators can ensure the ethical and responsible use of personal health data, fostering trust among users and promoting the benefits of these innovative technologies in healthcare.

Code of Conduct

The technology chosen for this study is a device that tracks and measures personal health indicators. The device will require data from patients, which needs to be collected with informed consent, and all regulatory compliance requirements and data protection laws must be followed and adhered to by companies and the medical industry. This will help patients trust the company and the medical industry with their information, and misuse of that information can be prevented.

The codes of conduct that need to be followed are as mentioned below:

1. Privacy and data protection

a. Data collection and its usage- Ensure that the device is upfront about collecting and using an individual's personal health data, and that the subject gives informed permission before any data is collected or used. Explain why the information is needed, what will be done with it, and who else could have access to it (Data Privacy Manager, 2023).

b. Ensure top data security- Protect sensitive health information from theft, loss, and misuse by using industry-standard security protocols. Encryption, safe archiving, access limits, and routine vulnerability scans are all part of this.

c. Data retention- Clear rules on how long health records should be kept should be set out. When data is no longer required, it should be safely deleted in accordance with legal and regulatory standards.
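Retention rules like those in point (c) can be reduced to a simple expiry check. The categories and periods below are illustrative assumptions, not requirements drawn from any specific law:

```python
from datetime import date, timedelta

# Illustrative policy: retention period per data category, in days.
RETENTION_POLICY = {"raw_sensor": 90, "aggregated_metrics": 365 * 2}

def is_expired(category: str, collected_on: date, today: date) -> bool:
    """Return True once a record has outlived its retention period
    and should be securely deleted."""
    return today > collected_on + timedelta(days=RETENTION_POLICY[category])

# Raw sensor data from January is past its 90-day window by June.
assert is_expired("raw_sensor", date(2023, 1, 1), date(2023, 6, 1))
# Two-year aggregated metrics are still within their window.
assert not is_expired("aggregated_metrics", date(2023, 1, 1), date(2023, 6, 1))
```

In practice a scheduled job would run such a check daily and log each deletion, so that disposal itself is auditable.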

d. De-identification and anonymisation- To further safeguard individuals' privacy, health records should be anonymized or de-identified wherever possible. Maintaining anonymity requires taking all necessary precautions (Maher et al., 2019).

2. User empowerment and informed consent

a. Patient’s control or data owner control- Allow people to make decisions about how their health information is collected, used, and shared. Provide choices for data sharing with healthcare practitioners or researchers and make it easy for consumers to access, evaluate, and adjust their data preferences.

b. Informed consent- Obtain people's informed permission before collecting their personal health data. Individuals will be able to make educated choices regarding device use if the purpose, advantages, dangers, and restrictions of data gathering are made clear (Sim, 2019).

c. User education- Increase user knowledge of the device's features, data gathering methods, and privacy controls. Make sure people know how to use the gadget properly and securely so they can make educated choices based on their health information.

3. Accuracy and reliability

a. Data accuracy- The device's measurements and data should be as accurate and trustworthy as possible. It is important to regularly calibrate and test the equipment to provide reliable findings and reduce the risk of inaccurate data influencing medical choices (Morley et al., 2020).

b. Algorithm and transparency- Be open and honest about the algorithms you’re using to analyse and interpret patients’ health information. In order to correctly interpret the data, users need to know how the gadget arrives at its conclusions.
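For instance, a transparent metric computation might publish its exact formula alongside the result. The sketch below uses the widely cited max_hr = 220 − age rule of thumb purely as an example; an actual device may use a different, possibly proprietary, model, which is exactly why disclosure matters:

```python
def target_heart_rate_zone(age: int) -> tuple[int, int]:
    """Moderate-intensity zone as 50-70% of estimated max heart rate.

    Transparency: the estimate uses the widely published rule of thumb
    max_hr = 220 - age. Documenting the exact formula lets users
    understand and double-check the numbers the device reports.
    """
    max_hr = 220 - age
    return (round(max_hr * 0.50), round(max_hr * 0.70))

print(target_heart_rate_zone(40))  # (90, 126)
```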

4. Ethical use of data and algorithms

a. Using data responsibly- Use sensitive patient information responsibly and in line with all rules and regulations pertaining to the device's intended purpose. Protect people's health information from being used in unethical or discriminatory ways.

b. Avoidance of bias- Make sure the device's algorithms have been designed and validated to reduce the likelihood of bias and unfair treatment. If you want to be sure that your data gathering and algorithmic decision-making processes are producing fair and equitable results, you should examine them on a regular basis and fix any problems you find.

5. Transparency and user communication

a. Transparency of data practices- Give people easy-to-understand details on how their health data is used and shared. Foster openness and trust by making it easy for users to understand the device's data practices, including any third-party partnerships or data sharing agreements (Kelly et al., 2020).

b. User Communication- Users should be able to voice their concerns, ask questions, and provide suggestions concerning the device and its data practices via established lines of contact. Respond to users as soon as possible, in a way that is both clear and helpful (Deloitte, 2020).

6. Compliance with Applicable Laws and Standards

a. Following laws and regulatory requirements- Respect all rules and regulations regarding the handling of sensitive health information. Maintain ongoing compliance with data privacy and protection duties by keeping abreast of changing regulatory standards (Icumed, 2022).

b. Industry Standards- Maintain privacy, protect sensitive information, and use patient health data ethically in accordance with industry standards. These include the General Data Protection Regulation (GDPR), the Health Insurance Portability and Accountability Act (HIPAA), and the information security standard ISO/IEC 27001.

7. Governance and accountability of the data

a. Continuous improvement- The device's features, privacy rules, and data practices should be reviewed and updated on a regular basis to account for technological developments, shifting legislative requirements, and user input. Maintain a constant drive to improve confidentiality, safety, and ethical standards.

b. Accountability- Establish transparent governance and accountability procedures for the ethical management of individual health records. Determine who is responsible for what in terms of protecting sensitive information and following ethical guidelines. Maintain a regimen of frequent audits and evaluations to check for inconsistencies and locate problem spots (Icumed, 2022).

Figure 4: code of conduct
(Source: Author, 2023)



CBS131 Cybersecurity Principles Report 2 Sample

Assessment Task

Produce a 1500-word cybersecurity group report. Advise on how to assess the cybersecurity threats facing the banking industry and apply an incident response plan to remediate from such attacks.

Please refer to the Task Instructions below for details on how to complete this task.

Task Instructions

Section A: Group Work

1. Group Formation

• Form a group of a maximum of 3 members.

• Your group must be formed by the end of Module 5 (Week 5) and registered.

• To register your group, you are required to send your Learning Facilitator an email before the registration deadline.

• Send an email to your Learning Facilitator with “CBS131 Group Registration” in the subject line. In the body of the email, please list the names and student ID numbers of all the members of your group. Also attach your completed Group Contract (see below for more details).

• Please note that you will work with your group members for Assessments 2 and 3.

2. Group Contract

Please read the attached CBS131_Assessments 2 & 3_Group Contract.

This document outlines the rules and conditions each group has to follow for both assessments as well as the roles and responsibilities of each group member. The group contract accounts for 5% of the assessment grade, as indicated in the Assessment Rubric.

• For assessments where students are expected to work in groups, the workload must be shared equitably among all group members. Please refer to sections 6.1 and 6.2 of the TUA PL_AC_014: Student Conduct Policy.

• When submitting the group contract, you are reminded not to ‘recycle’ (self-plagiarise) contracts from other assessments. Sections on deliverables, timeline and expectations should be unique to each assessment or project. Self-plagiarism constitutes a breach of Academic Integrity and can lead to penalties to the assessment or subject.

• During Assessments 2 and 3, you should keep records of communication and drafts. Any serious concerns about an
individual group member’s contribution should be brought to the attention of your Learning Facilitator as soon as they occur or at least two weeks before the due date, whichever is earlier.

• If a student has been accused of not contributing equally or fairly to a group assessment, the student will be contacted by the Learning Facilitator and given three working days to respond to the allegation and provide supporting evidence. If there is no response within three working days of contact, the Learning Facilitator will determine an appropriate mark based on the evidence available. This may differ from the mark awarded to other group members and would reflect the individual student’s contribution in terms of the quantity and quality of work.

Section B: Analyse the case and develop the group report

1. Read the attached case scenario to understand the concepts being discussed in the case.

2. Address the following:

• Review your subject notes to establish the relevant area of investigation that applies to the case. Study any relevant readings that have been recommended in the case area in modules. Plan how you will structure your ideas for the attacks/risk analysis, and remediation.

• Identify the methodology used to launch the cyber-attack against the bank and address the cyber threat landscaping and challenges facing the banking domain.

• Appraise the cyber attack’s impact on the bank’s operation.

• Explain the necessary security measures required to combat cyber threats, describe the basic security framework that banks need to have in place to defend against cyber threats and describe relevant security technologies to protect against cyber-attacks.

• Describe the strategies undertaken by banking management to regain customer trust in the aftermath of the cyber-attack.

• You will be assessed on the justification and understanding of security methods in relation to cyber-attack methodology, the impact of the cyber-attack on banking industries, and effective strategies that can be used to regain the trust of customers. The quality of your research will also be assessed as described in the Assessment Rubric section. You may include references relating to the case as well as non-academic references. You will need to follow the relevant standards and reference them. If you choose not to follow a standard, then a detailed explanation of why you have done this is required.

• The content of the outlined chapters/books and discussion with the lecturer in the Modules 1 to 4 should be reviewed. Further search in the library and/or internet about the relevant topic is encouraged.

3. Group member roles:

• Each member is responsible for researching/writing about two methods or strategies.

• All group members are responsible for editing and checking the references of the report at the end so it’s not one member’s sole responsibility.

4. The report should consist of the following structure:

• A title page with the subject code and name, assessment title, student name, student number and Learning Facilitator name.

• The introduction (approx. 150 words) should describe the purpose of the report. You will need to inform the reader of:

• a) Your area of research in relation to data breach attacks and its context

• b) The key concepts of cybersecurity you will be addressing and what the effects of a data breach are.

• The body of the report (approx. 1,200 words) will need to respond to the specific requirements of the case study. It is advised that you use the case study to assist you in structuring the security methods in relation to the attacks/risk analysis and remediation, cyber threat

• landscaping and challenges facing the banking domain, impact of cyber attacks on the organisation and its customers, necessary security measures required to combat cyberthreats and effective strategies that can be used to regain the trust of its customers.

• The conclusion (approx. 150 words) will need to summarise any findings or recommendations that the report puts forward regarding the concepts covered in the report.

5. Format of the report:

• The report should use the Arial or Calibri font in 11 point, be line spaced at 1.5 for ease of reading and have page numbers on the bottom of each page. If diagrams or tables are used, due attention should be given to pagination to avoid loss of meaning and continuity by unnecessarily splitting information over two pages. Diagrams must include the appropriate labelling in APA style.

Please refer to the Assessment Rubric for the assessment criteria.



The cyber threat landscape encompasses an entire segment of cybersecurity affecting organisations, user groups and specific industries. The daily emergence of novel cyber threats changes this landscape accordingly. The threat landscape comprises the factors that pose a risk to every entity within a relevant context. This case study report discusses the cyber threat landscape faced by banking sectors worldwide. The associated challenges of protecting data and maintaining customer confidence, especially in the corporate domain, are also discussed. The report focuses on data breaches as a strategy used by the actors and motivators of cybercrime to carry out malicious activities. A data breach can cause significant adverse effects for the parent organisation, because mishandling of sensitive information can result in identity theft (Benson, McAlaney & Frumkin, 2019). Hackers use such information for malpractice, such as opening new bank accounts or making fraudulent purchases.


Cyber threat Landscaping and challenges facing the banking Domain

The sole responsibility for sensitive data security management rests with the national government and the respective banking body. The global financial system has been undergoing a digital transformation, accelerated by the global pandemic. Technology and banking systems function in parallel to cater to digital payment and currency needs. Remote working of banking employees has necessitated access to sensitive information over personal data connections (Lamssaggad et al., 2021). This has facilitated data breach incidents across the globe, as hackers can easily access customers' banking data through personal internet networks. Cyber-attacks are more prominent in middle-income nations, as they are soft targets due to a lack of global attention.

Identify the methodology used to launch the cyber-attack against the bank

The continuation of cyber threats for the banking sectors involves identifying the following discussed methods as significant contributors.

Ransomware: The most significant form of cybercrime is ransomware, which involves encrypting selected files while locking out the legitimate user. The criminal then demands a ransom to restore access to the encrypted files. As a result, organisations can face prolonged system downtime, and paying the ransom does not guarantee that the criminals will restore the system (Blazic, 2022).

The risk from remote working: Introducing hybrid working conditions for employees has led to significant vulnerabilities as cloud-based software is used. The banking sectors face significantly higher data breach risks due to sensitive data accessibility via employees' networks and systems.

Social engineering: Social engineering exploits the most vulnerable aspect of the financial system: the customers themselves. Customers are manipulated into sharing their sensitive credentials via unauthorised networks. Forms of social engineering include whaling and phishing attacks.

Supply chain attacks: Cybercriminals target a comparatively weaker partner in the supply chain to distribute malware. Messages regarding products and services are circulated via the target partner's systems to make the content appear legitimate, at least superficially. It is an increasingly common cybercrime in financial sectors globally (Lamssaggad et al., 2021). Hackers can establish the apparent authenticity of these messages because poor security management by the network owner lets them gain control of the partner's networks.

Cyber attack’s impact on the bank’s operation

Figure 1: Risk diagram for the banking sectors
Source: (Self developed)


Table 1: Risk Matrix
Source: (Self Developed)

It can be stated from the above risk matrix that cyber security for the banking industry is closely tied to data security management policies. The matrix shows that a data breach is the most severe form of cyber risk affecting banking institutions, whereas the risks associated with remote working environments have rarely occurred in the sector. The reason for this rarity is that the database cannot be accessed from personal networks other than the bank's commercial network (Lallie et al., 2021).

Necessary security measures required to combat cyber threats

The launch of the “International Strategy to Better Protect the Global Financial System against Cyber Threats” in 2020 has suggested specific actions to reduce fragmentation. This can be achieved by fostering collaboration among significant international and governmental agencies, tech companies and financial firms (Dupont, 2019). The World Economic Forum has been guided by strategies covering four aspects: clarity regarding responsibilities and roles, the urgency of international collaboration, reducing fragmentation, and protection of international financial agencies. The governmental role involves the formation of financial CERTs (computer emergency response teams) for sharing sensitive risk management data, as per Israel's FinCERT. Cyber resilience can be strengthened by formulating appropriate responses, in the form of arrests, sanctions and asset seizures, to combat cyber threats legally.

A security framework that banks need to have in place to defend against cyber threats


Table 2: NIST cyber security framework
Source: (Self Developed)

The NIST cybersecurity framework can be utilised to assess and address every aspect of the problem, which is currently eroding the value of banking sectors across the globe (Kshetri, 2019). It has been noted that effective cyber security management greatly improves the relationships a bank maintains with its existing customers.

Security technologies to protect against cyber attacks

Intrusion Detection System (IDS): An IDS analyses network traffic to identify signatures corresponding to known attacks in the cyber domain. Human assistance or appropriate automated systems are then required to interpret the results, which adds a further layer of security to the process (Akintoye et al., 2022).
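A minimal sketch of the signature-matching idea, with two made-up signatures; real IDS engines such as Snort or Suricata use far richer rule languages than plain substring matching:

```python
# Each signature is a byte pattern associated with a known attack.
# These patterns are illustrative examples, not production rules.
SIGNATURES = {
    b"' OR '1'='1": "SQL injection attempt",
    b"../../etc/passwd": "Path traversal attempt",
}

def inspect(payload: bytes) -> list[str]:
    """Return the names of any known attack signatures in the payload."""
    return [name for pattern, name in SIGNATURES.items() if pattern in payload]

alerts = inspect(b"GET /login?user=admin' OR '1'='1 HTTP/1.1")
print(alerts)  # ['SQL injection attempt']
```

The need for interpretation mentioned above follows directly from this design: a matched signature is only an alert, and an analyst (or an automated triage system) still has to decide whether it is a real attack or a false positive.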


Figure 2: Elements of cybersecurity diagram
Source: (Geeksforgeeks 2022)

Data Loss Prevention (DLP): DLP uses data encryption to prevent data loss, protecting information so that it can be decrypted only with the appropriate encryption keys (Chang et al., 2020). The choice of encryption technology among AES, DES, and RSA determines the level of protection offered.
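Alongside encryption, DLP products typically also inspect outbound content for sensitive patterns before it leaves the network. A toy sketch of that inspection step, with deliberately simplified regular expressions, might look like this:

```python
import re

# Patterns for data a DLP policy might block from leaving the network.
# These are simplified illustrations, not production-grade rules.
PATTERNS = {
    "card_number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def classify(text: str) -> set[str]:
    """Return the categories of sensitive data found in outbound text."""
    return {name for name, rx in PATTERNS.items() if rx.search(text)}

print(sorted(classify("Card 4111 1111 1111 1111, contact jo@example.com")))
# ['card_number', 'email']
```

A real DLP gateway would combine such content inspection with the encryption described above, quarantining or encrypting any message that matches a policy.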

Firewalls: A firewall is a network security device that operates based on pre-defined security rules and decides whether to allow certain network traffic into the system. Firewalls may be implemented in hardware or software and are used to mitigate threats and monitor traffic.
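The first-match-wins rule evaluation that firewalls perform can be sketched as follows; the networks, ports, and default-deny policy here are illustrative assumptions, not any vendor's actual rule set:

```python
from dataclasses import dataclass
from ipaddress import ip_address, ip_network

@dataclass
class Rule:
    network: str   # source network the rule applies to
    port: int      # destination port
    allow: bool

# First matching rule wins; anything unmatched is denied by default,
# a common firewall convention used here for illustration.
RULES = [
    Rule("10.0.0.0/8", 22, True),    # SSH from internal hosts only
    Rule("0.0.0.0/0", 443, True),    # HTTPS from anywhere
]

def permitted(src: str, port: int) -> bool:
    for r in RULES:
        if ip_address(src) in ip_network(r.network) and port == r.port:
            return r.allow
    return False  # default deny

assert permitted("10.1.2.3", 22)       # internal SSH allowed
assert permitted("8.8.8.8", 443)       # public HTTPS allowed
assert not permitted("8.8.8.8", 22)    # external SSH blocked
```

Because rule order determines the outcome, misordered rules are a classic firewall misconfiguration, which is why the pre-defined rule set mentioned above must be reviewed carefully.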

Effective strategies that can be used to regain the trust of its customers

Loyalty exchange can be an effective strategy for regaining customers' trust in the global banking sector. The economy's dependency on digital transactions has made avenues for cybercrime more prominent for attackers. Every banking organisation needs to improve customer service quality significantly to achieve customer loyalty. Customer engagement can be increased by truthfully sharing banking scenarios with potential customers (Broby, 2021). Banking personnel should reciprocate customer loyalty to increase customers' trust.

The management of the banking sector should take adequate measures to help growing businesses in nearby localities. Transparency in banking systems should be put forward to increase customer satisfaction. Helpful behaviour on the part of banking institutes shall also sow the seeds of cooperation and confidence in customers. Adopting community-minded activities shall help banks instil dependability and trust in the banking sector once again.

The banks can utilise their economic knowledge about a particular economy to discuss the ill effects and benefits of investment into particular business sectors. The anxieties of customers regarding the management of their financial resources can be solved by the banks, especially at the branch level. This attitude shall reduce anxieties and improve customer reliance on banking systems (Ahmad et al., 2020). The warmth shared within the customer relationships shall effectively increase the confidence level of the customers in their respective banking institutes.


The report has discussed the cyber threat landscape and its challenges in the banking sector from a global perspective. It has been noted that the ongoing transition of financial transactions onto digitised platforms has widened the scope for data breaches. The potential risks associated with online monetary transactions, the use of UPI platforms and unauthorised access to sensitive data storage are major reasons for the rise in cybercrime. The associated damages are reflected in the withdrawal of confidence from banking sectors globally. The risk matrix has identified the probability of, and the factors which contribute to, the risks faced by banking institutes. The report has also discussed the methods hackers use to carry out such fraudulent activities. Finally, certain suggestions have been made for regaining customer confidence in banks on the newly digitised banking platforms.

Reference list


MIS102 Data and Networking Report 3 Sample

Task Summary

Create a network disaster recovery plan (portfolio) (1500 words, 10%-or 10%+) along with a full network topology diagram. This portfolio should highlight the competencies you have gained in data and networking through the completion of Modules 1 – 6.


The aim of this assessment is to demonstrate your proficiency in data and networking. In doing so, you will design a network disaster recovery plan for a company of your choice to demonstrate your proficiency with network design.

Task Instructions

1. Create a network disaster recovery plan (portfolio) along with a full network topology diagram for a company. (the choice of a company can be a local or international company)

2. It is recommended to investigate the same company that was researched in Assignment 1, as this creates a complete portrait of the company and becomes an e-portfolio of the work completed.

Note: The Company has branches worldwide and this should be considered when creating the network disaster recovery plan.

3. Network disaster recovery plan (portfolio)

Write a network disaster recovery plan of 1500 words (10%- or 10%+). The portfolio must include the following:

An introductory section that highlights the importance of having a recovery plan.

• What steps should the company take if:

o There is a sudden internet outage.

o A malware (e.g. a virus) has infected the computers in the company network.

o There is no local area network for the entire company. Is there a way to diagnose whether this is a hardware failure? What communication protocol stack might be affected?

o Only a part of the company loses internet connection.

o There is a power outage.

o There is a natural disaster such as an earthquake, tsunami, floods or fire.

o There is a password security breach.

• Are there precautions and post-planning to ensure that the company will not repeat the same network disaster?

• Anticipate the likely questions about the network design that will be raised by the client (Please note that this may include both technical and non-technical staff of the organization).

4. Network topology diagram

• Create a full network topology diagram, that could ensure the business continuity of the company.

• The diagrams need to be your own work and need to be developed using Visio or Lucidchart or an approved graphic package. (Please seek the approval of the learning facilitator prior to commencing this activity).

• All diagrams need to be labeled and referenced if they are not your own.

• The full network topology will be part of the network disaster recovery plan and should be used to further enhance the understanding of the recovery plan.



Even a digital firm like Apple may experience network outages and catastrophes in today's rapidly developing technological ecosystem. This report digs into the complex world of network disaster recovery planning, a vital part of modern corporate operations, tailored to the specific requirements of a multinational corporation of Apple's stature. The capacity to recover quickly from network failures, cyber-attacks, and natural disasters is critical in today's always-connected digital world. This analysis highlights the value of preventative disaster recovery procedures by describing Apple's plans to ensure the availability of critical services, the security of sensitive data, and the robustness of the company in the face of adversity.

Network disaster recovery plan

An organization like Apple would utilize a network disaster recovery strategy to restore its whole network in the event of a catastrophe. Finding the network's weak spots, creating a list of potential risks, developing a strategy to deal with those risks, and outlining a backup plan are all critical parts of a disaster recovery strategy (Meilani, Arief & Habibitullah, 2019).

Recovery plan – It allows Apple to keep operating, serving customers, and making money in the event of a calamity.

Protect Data - It helps ensure that essential data is kept safe and can be recovered in the event of a disaster or legal complication (Zhang, Wang & Nicholson, 2017).

Reduce Monetary Costs - Significant monetary losses can result from downtime and data loss. These losses can be mitigated with a solid recovery strategy.

Protect Reputation - A speedy recovery shows that Apple values its consumers and will do what it takes to keep them happy.

Aspects of this plan

Precautions and Planning

- Organizations like Apple can reduce the likelihood of future network catastrophes by taking preventative measures such as maintaining a recovery strategy that accounts for developing risks and emerging technology.

- Training Employees - Regularly train personnel on disaster preparedness and security best practices (Butun, Osterberg & Song, 2019).

- Regular testing and exercises should be carried out to ensure the efficacy of the disaster recovery strategy.

- Audits of the security measures in place should be carried out regularly to detect any flaws or weaknesses.

When it comes to addressing network failures and catastrophes, Apple, as a leader in the computer sector, must methodically develop and implement a complete set of safeguards and preventative measures to keep operations running smoothly.

Preventing internet outages is an important consideration. Apple would be wise to employ multiple independent internet connections through different ISPs (Finucane et al., 2020). To mitigate the effects of an ISP outage, these links must automatically fail over to a backup connection in the event of an interruption. In addition, the user experience and availability may be improved by using Content Delivery Networks (CDNs) to cache content closer to end-users. To further guarantee that key services are always available, especially during peak use periods, Apple should implement Quality of Service (QoS) policies that prioritize crucial traffic during network congestion.
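To make the QoS idea concrete, the fragment below is a minimal Cisco-IOS-style sketch of such a policy. The class names, DSCP marking, bandwidth percentage, and interface are hypothetical placeholders for illustration, not Apple's actual configuration.

```
! Sketch only - class names, markings, and interface are hypothetical.
class-map match-any CRITICAL-SERVICES
 match dscp ef                      ! traffic already marked as critical
!
policy-map WAN-EDGE-QOS
 class CRITICAL-SERVICES
  priority percent 30               ! guarantee bandwidth under congestion
 class class-default
  fair-queue                        ! best-effort queueing for everything else
!
interface GigabitEthernet0/1
 service-policy output WAN-EDGE-QOS
```

Under congestion, the priority queue guarantees the marked traffic up to 30% of the link, while remaining traffic shares the rest fairly.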

Apple must implement sophisticated threat-detection systems capable of identifying malware in real time if it wants to stop infections caused by malicious software. The danger of malware intrusion through phishing attempts and other vectors can be reduced by frequent employee training programs. As important as network-wide defenses are, stopping malware infections at their source requires effective endpoint-protection software. Apple should keep spares of its network gear on hand in case of LAN problems so that it can quickly restore service. Continuous network-monitoring tools can spot problems and hardware breakdowns early, allowing for preventative maintenance. Accurate, detailed records of network configurations speed up troubleshooting in the event of a malfunction (Schultz, 2023).

Apple should implement network segmentation to ensure that mission-critical services remain available in the event of a partial loss of internet connectivity. In the case of a partial outage, technologies like Border Gateway Protocol (BGP) can be utilized to redirect traffic and keep services up. To ensure the failover procedures work as intended, they must be tested often. Reducing the likelihood of a power outage is also crucial: Apple should install UPS systems in its mission-critical data centers and server farms to keep machines running during power outages, and backup generators can extend the electrical supply during longer interruptions. Equipment failures during power outages can be avoided with regular power-system maintenance (Rosencrance, 2023).
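As a rough illustration of BGP-based failover, the fragment below prefers a primary ISP and falls back automatically when that session drops. All AS numbers are private-range and all addresses are documentation-range placeholders; this is a sketch, not a production design.

```
! Sketch only - private AS numbers and documentation-range addresses.
router bgp 64512
 neighbor 203.0.113.1  remote-as 64496    ! primary ISP
 neighbor 198.51.100.1 remote-as 64497    ! backup ISP
 neighbor 203.0.113.1  route-map PREFER-PRIMARY in
!
route-map PREFER-PRIMARY permit 10
 set local-preference 200   ! routes learned from the primary win; if that
                            ! session drops, BGP withdraws them and traffic
                            ! shifts to the backup ISP automatically
```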

Apple should spread its data centres over many locations to lessen the effects of calamities that affect only a small area. If data loss occurs, it may be quickly recovered through real-time data replication to alternative data centres. Having a fully functional, off-site disaster recovery site with all data and resources synced across to it adds a further layer of protection. Apple should deploy Multi-Factor Authentication (MFA) for vital systems and accounts to stop password security breaches. Passwords should be changed often and meet a minimum complexity standard to reduce the possibility of hacking. Regular security audits are also important to find password security flaws.

As part of Apple's continuous dedication to network resilience and disaster recovery readiness, the company should continually reinforce these preventative actions. Apple is better able to protect its worldwide user base from interruptions in service because of the efforts it has taken to implement these measures throughout its network architecture (Peterson & Hilliard, 2022).

Client-Focused Question Anticipation

Questions from the Technical Staff

1. How frequently should we revise our disaster recovery strategy? Plans should be examined and revised at least once a year, or more frequently if necessary.

2. How often is crucial information backed up? Specify how often and what kind of backups will be done.

3. Can you give me a rough estimate of how long each recovery plan will take? - Please include the planned recovery times.

Questions from Non-Technical Employees

1. How Will My Work Be Affected? - Describe the precautions taken to keep disruption to normal activities to a minimum.

2. To what extent do workers contribute to disaster preparedness? Stress the importance of promptly reporting problems and following established protocols.

3. In the event of an emergency, how will information be disseminated to staff members? Explain the established communication channels.

Network Diagram

Figure 1 Network topology diagram for Apple


This research has shown that even a technology leader like Apple needs a network disaster recovery strategy to keep operations running smoothly. Apple can maintain its rate of innovation and service availability by methodically addressing a wide range of potential calamities, from cyberattacks to natural disasters. Redundancy, backup solutions, and personnel training help the organization handle interruptions with resilience and agility, allowing it to keep its promise to clients all across the world. By adopting these measures and embracing a culture of readiness, Apple can keep its operations running smoothly, preserve its image, and weather any storm.



MITS5501 Software Quality, Change Management and Testing Report 2 Sample

This assessment related to the following Unit Learning Outcomes:

ULO1 Adopt specialized quality engineering and assurance procedures to improve the implementation quality and efficiency of software engineering projects using the advanced concepts and principles learnt throughout the unit.

ULO2 Independently develop clearly defined internal quality management approaches by addressing the quality factors and risks that may affect the resulting software development.

ULO3 Evolve peer review process using tools and techniques taught in the unit as well as carry out research on emerging techniques published in literature to further improve the peer review communication process


In this assessment students will work individually to develop a Software Quality Assurance plan document. Carefully read the associated CASE STUDY for this assessment contained in the document MITS5501_CaseStudy_2023.pdf. From this Case Study you are to prepare the following:

1. Given the details in the Case Study, what are the software standards, practices, conventions, and metrics that need to be used to improve the quality of the final product? You also need to identify the techniques to monitor compliance with these standards.

2. Identify the tools and techniques used to perform peer reviews and the methods to reduce the risk of failure.

3. Develop a complete software quality assurance plan document based on the given case study. The document should have the following sections. However, you could add other topics based on your assumptions.

Quality Assurance Plan Document

a. Executive Summary

b. System Description

c. Management Section

d. Documentation Section

e. Standards, Practices, Conventions and Metrics

f. Peer reviews plan

g. Testing Methodology

h. Problem Reporting and Corrective action

i. QA Supporting Tools, Techniques and Methods

j. Software configuration management plan.

k. References

l. Appendices

Your report must include a Title Page with the title of the assessment and your name and ID number. A contents page showing page numbers and titles of all major sections of the report. All Figures included must have captions and Figure numbers and be referenced within the document. Captions for figures placed below the figure, captions for tables placed above the table. Include a footer with the page number. Your report should use 1.5 spacing with a 12-point Times New Roman font. Include references where appropriate. Citation of sources is mandatory and must be in the IEEE style. 



This study identifies the issues in the library management system and helps propose a solution through a digital library management system that can be created, implemented, and managed to fulfill the requirements of both staff and customers. This study includes different sections: the management section; the documentation section; standards, practices, conventions, and metrics; reviews and inspections; the software configuration management plan; quality assurance; and testing.

Purpose Section

This section describes what software is included in the package and how it will be used, among other things. It also outlines the phases of each software product's life cycle that the SQA plan will cover. This section provides simple guidance for making sure the SQA strategy is appropriate for the program in question and its development and implementation stages [6].

Reference Document Section

All sources used to create the SQA plan are listed in detail in the Reference Documents section. This compiled document makes it simple to find resources that enhance and elaborate on the plan's primary text. Industry standards, project guidelines, process documentation, and other sources may be cited here to aid in the creation, execution, and assessment of the Software Quality Assurance strategy.

System Description

The Library Management System is an automation tool made for libraries of various sizes. This computerized system allows librarians to keep tabs on book lending, organize student information, and analyze collection depth. The system's central repository for books and member data helps avoid the kinds of problems that plague non-digital archives. In addition to improving library administration efficiency, the reporting module helps administrators with things like student enrolment, book lists, and issue/return data [5].

Figure 1 Entity Relationship Diagram of Library Management System

Management Section

There is a clear chain of command within the project's organizational structure. The Project Manager is responsible for directing the project and making sure it is completed on time, within scope, and budget. System design, coding, and database administration all fall under the purview of the development team, which consists of software developers and database administrators. The system's reliability and effectiveness are monitored by the Quality Assurance group. Feedback and user testing are provided by administrative and library employees.

Documentation Section

The software's governing documentation covers its whole lifespan, from development to maintenance. The Software Requirements Specification (SRS) first defines the parameters of the project; project staff and interested parties check this document to make sure it covers everything. The Software Design Document (SDD) specifies the system's architecture, algorithms, and interfaces before, during, and after development, and is evaluated by the development staff and domain specialists. The Test Plan and Test Cases documents outline the goals, methods, and anticipated results of the verification and validation processes; peer evaluations and test-execution outcomes are used to determine their sufficiency. The reliability and usefulness of the program rely on these documents, which are kept up to date through regular reviews, audits, and user feedback channels [1].

Standards, Practices, Conventions and Metrics Section

- Standards: This study will use conventional coding practices, such as those for file naming and commenting, as well as database best practices. Data encryption and privacy shall meet or exceed all applicable global requirements.

- Practices: Scrum, daily stand-ups, and continuous integration are just a few of the Agile development practices that will be used. Git will be used for version control, which will facilitate teamwork throughout development and help in tracking changes.

- Conventions: We will require that all variables, functions, and database tables adhere to standard, human-readable names. Usability and accessibility guidelines will be taken into account throughout the UI design process.

- Metrics: Important performance indicators, such as system response times, error rates, and user satisfaction surveys, will be outlined. The quality of the code will be evaluated with the help of static analysis software [3].

Reviews and Inspections Section

At essential points in the planning and execution of the project, it will be reviewed and inspected by both technical and management personnel. Code quality and compliance with coding standards will be monitored by technical reviews, while project progress and resource allocation will be evaluated by management reviews. Reviews, walkthroughs, and inspections will be followed up with action items to remedy identified concerns, and approvals will be issued based on their successful completion to guarantee that the project continues to meet its quality goals and remains on schedule.

Software Configuration Management Section

Software configuration management (SCM) is an integral part of software development since it allows for the centralized management of all software and documentation revisions throughout a project's lifetime [4]. The SCMP focuses on the following topics:

1. Configuration Identification: This part of the SCMP defines how software and documentation configurations will be called and labeled. It details how CIs should be named, how versions should be numbered, and how their structure should look. It specifies who's responsible for what when it comes to creating and maintaining these identifiers.

2. Configuration Control: Software and documentation configuration management is outlined in the SCMP. There is a procedure for handling requests for modifications and putting them into effect.

3. Configuration Status Accounting: This section explains the methodology that will be used to track and report on the current state of setups. Specific sorts of status data, such as versioning, release notes, and baselines, are outlined. It also details how often and how to provide progress reports.

4. Configuration Audits: The SCMP specifies the steps to take while performing a configuration audit, whether it be an internal or external audit. It lays out the goals of an audit, the roles of the auditors conducting it, and the measures that should be taken in response to their findings.

5. Configuration Baselines: Sets the standards and methods for determining what constitutes a "configuration baseline," or a stable and officially sanctioned version of the program and documentation. It specifies how to choose, label, and file baselines.

6. Tools and Environment: The software configuration management (SCM) tools and environments that will be used throughout the project are covered in this section. Version control, bug tracking, and other configuration-management technologies are described in depth.

7. Roles and Responsibilities: The SCMP establishes the tasks and functions of the SCM team members. The SCM manager, developers, testers, and other project participants fall under this category. It clarifies who is responsible for configuration identification, control, status accounting, audits, and baselining.

8. Training and Documentation: This document specifies the documentation and training needs of the SCM team. The SCMP, together with the process guidelines and SCM tool user manuals, are all part of the required paperwork.

9. Security and Access Control: This section deals with the topic of SCM-related security and access control. It specifies the rules for controlling access, encrypting data, and other security procedures to ensure the safety of configurations and associated data.

10. Continuous Improvement: Provisions for continuous process improvement are included in the SCMP. It specifies how the results of audits, reviews, and inspections will be included in the ongoing effort to improve SCM procedures.

Problem Reporting and Corrective Action

The Software Configuration Management Plan (SCMP) for the project includes detailed instructions for tracking down and fixing bugs and other problems in the program and supporting documentation. The procedure for recording issues, particularly their severity and effect, as well as tracking and assigning them to be fixed, is outlined. It also details the processes involved in identifying problems, conducting investigations, and applying fixes.

Tools, Techniques, and Methodologies Section

Software tools, techniques, and methodologies that aid Software Quality Assurance (SQA) are described in detail in the Tools, Techniques, and Methodologies section. Project-management processes such as Agile or Waterfall may be used, along with other resources like testing frameworks, version control systems, automated testing tools, peer review platforms, and more.

Code Control Section

Code Control describes the processes and tools used at each step of development to monitor and maintain the controlled versions of the designated program. Both software configuration management and the use of pre-existing code libraries are viable options. The integrity and traceability of software components are protected throughout the development lifecycle by this section's methodical version management of code.

Media Control Section

The Media Control section describes the processes and resources used to track down, organize, and secure the physical media that corresponds to each computer product and its documentation. This includes outlining how to back up and restore these media assets and taking precautions to prevent them from being stolen, lost, or damaged.

Supplier Control Section

In the Supplier Control section, we detail the procedures used to guarantee that third-party developers' code meets all of our expectations. Methods for ensuring that vendors receive sufficient and thorough specifications are outlined, along with the measures taken to guarantee that previously developed software is compatible with the features addressed in the SQA strategy. If the software is still in the prototype phase, the supplier must create and execute their own SQA plan according to the same criteria.

Records Collection, Maintenance, and Retention Section

In the section under "Records Collection, Maintenance, and Retention," the precise SQA records that will be kept are outlined. It specifies the retention period and details the processes and resources needed to create and maintain this record. Acquiring the necessary permissions and developing a strategy for execution are both key parts of putting the SQA plan into action. After the SQA plan has been implemented, an assessment of its efficacy may be performed, guaranteeing the orderly maintenance and storage of crucial documents throughout the project's lifespan.

Testing Methodology

The Testing Methodology section details the overall strategy, specific methods, and automated resources used during software testing. It specifies the various forms of testing (such as unit, integration, system, and acceptability testing) and the order in which they will be executed. Specific testing methods, such as black-box and white-box testing, as well as automated testing tools and frameworks, are described in this section. It assures that the software's functionality, reliability, and performance are tested in a systematic and well-organized manner [2].




ICT102 Networking Report 3 Sample

Assessment Objective

The objective of this assessment is to evaluate student’s ability to design and configure a network using a network simulator for a given scenario and configuring routers, switches, firewalls, and other network components based on specific requirements.


This assignment is group-based. Each group will have 3-4 students. You can continue with the same group as for the previous assessment. In this assignment you will be designing and configuring a network for a university that has a Class B IP address of There are two faculties and each faculty requires two separate subnets: one for staff and another for students. The faculty names and the number of hosts in each subnet are given below:

• Faculty of Arts: 400 students and 200 staff members

• Faculty of IT: 600 students and 300 staff members
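The host counts above can be turned into a subnet plan programmatically. The sketch below uses Python's standard ipaddress module; since the brief omits the actual Class B address, the block 148.23.0.0/16 is assumed purely for illustration. Allocating the largest subnets first keeps every block aligned on its own network boundary.

```python
import ipaddress
import math

# Host counts per subnet, taken from the brief.
requirements = {
    "IT students":   600,
    "IT staff":      300,
    "Arts students": 400,
    "Arts staff":    200,
}

# The brief omits the actual Class B address; 148.23.0.0/16 is an
# assumed placeholder for illustration only.
base = ipaddress.ip_network("148.23.0.0/16")

plan = {}
next_addr = int(base.network_address)
# Allocate the largest subnets first so each block starts on a
# valid network boundary for its prefix length.
for name, hosts in sorted(requirements.items(), key=lambda kv: -kv[1]):
    # Reserve room for the hosts plus network, broadcast, and gateway.
    prefix = 32 - math.ceil(math.log2(hosts + 3))
    net = ipaddress.ip_network((next_addr, prefix))
    plan[name] = net
    next_addr = int(net.broadcast_address) + 1

for name, net in plan.items():
    print(f"{name:14} {net}  (usable hosts: {net.num_addresses - 2})")
```

With these inputs the 600-host subnet needs a /22 (1022 usable hosts), the 400- and 300-host subnets need a /23 each, and the 200-host subnet fits in a /24 — a starting point for the summary table requested in Part 1.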

Part 0

Declare team members' contributions in the table below:

Part 1

Divide the allocated address space between department subnets as per requirements. Summarize the IP subnets and masks in a table like this:

Part 2

Construct the following network topology in GNS3 or Packet Tracer simulator. Ensure that all the hostnames and network addresses are well labelled.

Part 3

Configure the router using the assigned hostnames and IP address.

Part 4

Set up a Virtual PC (VPC) in each of the four subnets as shown above. The virtual PCs provide a lightweight PC environment to execute tools such as ping and traceroute. For each faculty, create two VPCs for students and two VPCs for staff. Each VPC should be able to ping the other VPC in the same subnet.

Part 5

Configure the access control list (ACL) on Router01 such that any traffic from the students' subnets is blocked from entering the staff subnets. Traffic to and from other subnets should pass through. Pinging staff VPCs (in both faculties) from students' VPCs should fail. In other words, students in each faculty should not be able to ping any staff computer in any faculty. Students can only ping student VPCs in any faculty. Staff members can ping any VPC (staff and students in any faculty).
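One possible realization of this policy is a Cisco-style extended ACL applied inbound on the student-facing interfaces. The subnet addresses below are hypothetical (a 148.23.0.0/16 plan chosen for illustration) — substitute your own Part 1 allocations. Permitting echo-replies before the deny rules lets staff-initiated pings receive their replies while student-initiated pings to staff are dropped:

```
! Hypothetical addressing - replace with your Part 1 subnet plan.
ip access-list extended STUDENT-TO-STAFF
 permit icmp any any echo-reply                       ! replies to staff pings
 deny   ip 148.23.0.0 0.0.3.255 148.23.6.0 0.0.1.255  ! IT students -> IT staff
 deny   ip 148.23.0.0 0.0.3.255 148.23.8.0 0.0.0.255  ! IT students -> Arts staff
 deny   ip 148.23.4.0 0.0.1.255 148.23.6.0 0.0.1.255  ! Arts students -> IT staff
 deny   ip 148.23.4.0 0.0.1.255 148.23.8.0 0.0.0.255  ! Arts students -> Arts staff
 permit ip any any                                    ! everything else passes
!
interface FastEthernet0/0
 ip access-group STUDENT-TO-STAFF in
```

The wildcard masks mirror the subnet sizes (0.0.3.255 for a /22, 0.0.1.255 for a /23, 0.0.0.255 for a /24).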

Part 6

Configure DHCP services on Router01 such that all VPCs can get IP addresses dynamically assigned.
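A minimal IOS-style DHCP configuration on Router01 might look like the fragment below, with one pool per subnet (two of the four pools shown; the addressing is again a hypothetical 148.23.0.0/16 plan). Excluding the gateway addresses keeps them out of the dynamic range:

```
! Hypothetical addressing - one pool per subnet; repeat for the other two.
ip dhcp excluded-address 148.23.0.1
ip dhcp excluded-address 148.23.6.1
!
ip dhcp pool IT-STUDENTS
 network 148.23.0.0 255.255.252.0
 default-router 148.23.0.1
!
ip dhcp pool IT-STAFF
 network 148.23.6.0 255.255.254.0
 default-router 148.23.6.1
```

After configuration, running `ip dhcp` on each VPC should obtain an address from the pool matching its attached subnet.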

Part 7

Use the following checklist to ensure your network is configured correctly.

For each of your routers, make sure to save your running configuration using the command write mem. For the VPCs, use the save filename command to save the configurations to a file.

Finally save the GNS3 (or Packet Tracer) project, i.e., the topology together with the startup configs. Zip the GNS3 (or Packet Tracer) project folder and submit it on Moodle with your report. Make sure your submission is complete and has all the necessary files to run the simulation.



Dynamic Host Configuration Protocol (DHCP) is a client-server protocol that provides an Internet Protocol host with its IP address and other configuration information. DHCP is applied in an open networking system to transfer configuration information from a server to client PCs at the stated IP addresses. The protocol allows nodes to communicate openly, subject to permission from the local host. Using DHCP, all networking components can be configured automatically, suppressing the configuration errors that would otherwise occur in the network. This report implements the DHCP protocol in a university networking system; the goal is to enable communication between the two departments, Arts and IT.

Network configuration and Ping status

The university network is a three-layer network consisting of routers, switches, and virtual PCs. The first layer contains the router configured as the host of the network. The second layer consists of four switches, and the third layer consists of eight PCs connecting the users. Each device has its own IP address. A condition of the network is that students cannot communicate with the staff of either department. The overall network configuration is shown below.

Figure 1: Network topology
(Source: Self-Created)

This is a three-layer network configuration in which each component has its own IP address. The router provides internet connectivity to all of the respective nodes. A C3725 router has been chosen for the system because it offers 10 ethernet ports. The characteristics of this router are stated below.

Figure 2: Proposed router configuration
(Source: Self-created)

The router has 128 MiB of RAM and 256 KiB of NVRAM, with its I/O memory set to 5% of the RAM. The I/O memory is responsible for storing the IP addresses inside the router. The next layer contains four switches that connect the eight computers across the four groups: IT students, IT staff, Arts students, and Arts staff each use two computers. The DNS address of each system is the same. There are four branches with four gateway addresses: 148.23.0.1, …, …, and 148.23.8.1, and the eight connected computers hold IP addresses in the corresponding subnets; the IT student machines use addresses in domains 0, 1, 2, and 3 respectively. Pings succeed for internal communication, but fail when a student machine attempts to reach the IT staff subnet. The overall ping status is shown below.

Figure 3: Successful ping status
(Source: Self-created)

Figure 4: Unsuccessful ping status
(Source: Self-created)


This report concludes that the DHCP network protocol is widely accepted for automatic configuration, and that, combined with access control, traffic into a particular subnet can be prohibited. The report has shown that, using PC 1, one can send information to all nodes except the IT staff machines.




ICT500 Emerging Technologies Report 3 Sample

Assessment Description and Instructions

Title: Explore emerging technologies in the field of AI and their potential impact on various industries


Artificial Intelligence (AI) is transforming the world at an unprecedented pace, impacting various fields, including healthcare, finance, education, and manufacturing, among others. Emerging technologies such as machine learning, natural language processing, computer vision, and robotics have significantly advanced the capabilities of AI systems. As these technologies continue to evolve, it is essential to explore the potential benefits and risks that come with their integration into various industries. Privacy and security are two critical aspects that must be considered as AI continues to shape our future.


• Explore emerging technologies in the field of AI and their potential impact on various industries.

• Investigate the ethical and legal implications of AI systems in terms of privacy and security.

• Analyse the potential benefits and risks of integrating AI systems in various industries from a privacy and security perspective.

• Develop recommendations on how organizations can manage the privacy and security risks associated with AI systems.


1. Choose one of these industries:
a. Healthcare
b. Finance
c. Education
d. Manufacturing

2. Provide an overview of emerging technologies in the field of AI, including machine learning, natural language processing, computer vision, and robotics. Discuss their potential impact on one of the above industries.

3. Investigate the ethical and legal implications of AI systems in terms of privacy and security. Consider aspects such as data protection, consent, transparency, and accountability. Analyze the current state of privacy and security regulations in your country or region and identify any gaps that need to be addressed.

4. Analyze the potential benefits and risks of integrating AI systems in various industries from a privacy and security perspective. Consider aspects such as data privacy, data security, cyber threats, and potential biases in AI systems. Provide examples of organizations that have successfully integrated AI systems while managing privacy and security risks.

5. Develop recommendations on how organizations can manage the privacy and security risks associated with AI systems. Consider aspects such as risk assessment, privacy by design, cybersecurity measures, and ethical considerations. Provide examples of best practices for organizations to follow.

• Introduction: Provide an overview of the topic and the objectives of the assignment.

• Literature Review: Discuss the emerging technologies in the field of AI and their potential impact on the chosen industry. Investigate the ethical and legal implications of AI systems in terms of privacy and security.

• Analysis: Analyse the potential benefits and risks of integrating AI systems in the chosen industry from a privacy and security perspective. Develop recommendations on how organizations can manage the privacy and security risks associated with AI systems.

• Conclusion: Summarize the key findings and provide recommendations for future research.


1. Abstract

This research undertook an exploratory study of the emerging world of artificial intelligence (AI) technologies and their ethical and legal repercussions, with a strong emphasis on privacy and security, prompted by AI's revolutionary influence on healthcare. A new era of healthcare opportunities has been brought about by the incorporation of AI technologies such as robotics, natural language processing, computer vision, and machine learning. It offers improved outcomes for patients and resource optimization via early illness identification, customized therapies, operational efficiency, and expedited medical research. But these incredible possibilities are matched with very difficult obstacles. Due to the importance and sheer quantity of patient information involved, privacy and security considerations carry significant weight. Informed consent and transparency are two ethical requirements that highlight the need for responsible AI implementation. Because AI algorithms are opaque and often referred to as "black boxes", creative ways to ensure accountability and explicability are required. An analysis of the privacy and security legislation in place exposes a fragmented environment and highlights the need for ongoing harmonization and adaptation. The core of the study is its steadfast dedication to identifying these issues and recommending fixes. In the field of healthcare, it is morally required to consider privacy and security concerns while integrating AI. The suggestions made, which include strict information security, informed consent, algorithm transparency, and regulatory compliance, set the path for a trustworthy AI ecosystem in the healthcare industry and promise improved patient care and healthcare delivery going forward.

2. Introduction and objectives

Artificial Intelligence (AI) is a game-changing force that is transforming many different sectors. Its significant influence on the healthcare sector is especially remarkable. The impact of artificial intelligence (AI) is enormous in a society where technology is becoming more and more important. These technologies—which include robots, computer vision, natural language processing, and machine learning—have created previously unheard-of opportunities to improve patient outcomes, healthcare delivery, and medical research. In the end, it might bring in a new age of better medical care and more effective resource allocation by streamlining intricate diagnostic processes, treatment plans, and administrative duties. The use of AI in healthcare is now required; it is no longer an optional step. The need for quick, informed choices has never been higher as the demand for superior healthcare services keeps rising and health information becomes more complex. However, entering this AI-driven healthcare space requires careful consideration of cutting-edge AI technology. The critical analysis of the legal and moral ramifications that AI systems bring, especially with security and privacy, is equally important. It is essential, in this regard, to thoroughly evaluate these new technologies and any ethical and legal implications that may arise.

Objectives of the report:

• To explore emerging technologies in the AI field and their potential effects on the healthcare industry.

• To investigate the ethical and legal implications of AI systems in terms of privacy and security.

• To analyze the potential advantages and risks of integrating AI systems into the healthcare industry from a privacy and security perspective.

• To develop recommendations on how organizations can handle the privacy and security risks associated with AI systems.

3. Background/Literature review

Artificial Intelligence (AI) is a disruptive force that has permanently changed a broad range of sectors [1]. Its widespread impact redefines how operations are carried out and cuts across all industries. With a special emphasis on its enormous influence on healthcare, this section explores the complex role that artificial intelligence plays across a range of sectors. Here, the dynamic field of developing AI technologies, which includes robotics, computer vision, machine learning, and natural language processing, takes center stage, and this investigation helps to clarify the significant changes these breakthroughs bring to the field of healthcare. Additionally, this section provides insight into the possible advantages and hazards associated with the smooth integration of AI in the healthcare industry. AI's widespread use in current sectors shows its flexibility and innovative potential, and its impact on healthcare goes beyond augmentation to transformation. As AI advances, the provision of healthcare, outcomes for patients, and medical research will change. Benefits include better diagnosis and treatment, faster administrative procedures, improved resource allocation, and accelerated medical research. These exciting advancements come with dangers involving security, confidentiality, ethics, and regulatory compliance [2]. This section prepares for a detailed discussion of AI's position in healthcare and its legal, moral, and practical ramifications.

AI's Pervasive Impact

Artificial Intelligence has a genuinely global influence, transforming a wide range of sectors, including industry, banking, education, and more. AI has shown its potency in enhancing productivity, improving decision-making procedures, and raising overall operational excellence in several sectors. However, the area of healthcare is where artificial intelligence is most noticeable. This industry is characterized by the confluence of three key factors: the need for precision medicine, an ever-growing pool of complicated medical data, and rising healthcare needs. The incorporation of artificial intelligence has become a necessary paradigm shift in answer to these complex difficulties. Through improving diagnostic precision, refining treatment plans, simplifying administrative procedures, and accelerating medical research, it has the potential to completely transform the healthcare industry. The importance of AI in healthcare is highlighted in this context, since it not only meets the industry's present demands but also opens the door to more efficient and patient-centered healthcare delivery in the future [3].

Emerging AI technologies in healthcare

Healthcare and Machine Learning: With its invaluable capabilities, machine learning has emerged as a key component of healthcare. Clinical practice is changing as a result of its competence in tasks including medical image interpretation, patient outcome prediction, and optimal treatment choice identification. Machine learning algorithms adapt to the changing healthcare environment by continually learning from large datasets. This allows them to provide insightful information, support doctors in making data-driven choices, and enhance patient care. The capacity to identify nuanced patterns and trends in medical data gives medical personnel an invaluable tool for illness diagnosis and treatment, which in turn improves patient outcomes and streamlines healthcare delivery [4].

Healthcare and Natural Language Processing (NLP): In healthcare, Natural Language Processing (NLP) is revolutionary because it makes it possible for computers to understand and extract information from unstructured medical text data. With this technology, healthcare facilities can automate clinical documentation procedures, glean insightful information from large volumes of medical records, and quickly retrieve vital data. NLP's capacity to decipher intricate medical narratives improves administrative efficiency while also revealing important information concealed in healthcare data. This helps medical professionals give more accurate and knowledgeable treatment, which ultimately improves patient outcomes [5].

Computer Vision's Role in Medical Imaging: By using AI-driven algorithms, computer vision is driving major advances in medical imaging. These advanced tools can identify abnormalities in medical images with an unprecedented level of precision. This revolutionary technology, especially in radiology and pathology, speeds up diagnostic and treatment choices. Computer vision algorithms help medical personnel identify anomalies quickly by rapidly analyzing large datasets of images. This improves diagnostic accuracy and speeds up the start of suitable treatment procedures. Combining artificial intelligence with medical imaging allows for earlier detection and better results, which is a major advancement in patient care [6].

Healthcare Robotics: Robotics is becoming a more versatile medical tool, moving beyond its traditional use in surgery. Robots with artificial intelligence (AI) capabilities are performing medical care, drug administration, and even precision surgery. These robots raise overall healthcare effectiveness, reduce the likelihood of human error, and enhance precision. They improve the quality of life for individuals with limited mobility and offer critical assistance to doctors during surgery with unparalleled accuracy. The integration of robotics into healthcare is an excellent illustration of how AI may complement human abilities to deliver safer, more efficient, and patient-centered care [7].

Potential benefits in the healthcare industry

Numerous benefits arise from the use of AI in healthcare:

Better Early Diagnosis: Artificial Intelligence is used to detect diseases at an early stage, which enables timely interventions and personalized treatment plans.

Prediction: By identifying illness risk factors, AI systems allow preventive measures to be taken.

Precision Medicine: Customized treatment plans based on each patient's unique genetic and health profile are possible because of AI-enabled precision medicine.

Streamlined Administrative Tasks: Artificial Intelligence (AI) lowers paperwork and boosts operational performance by automating administrative processes.

Resource Allocation Optimization: AI helps ensure that hospitals have the right resources available when they are required.

Cost reduction: AI reduces healthcare expenses by increasing operational efficiency, which raises treatment accessibility and affordability.

Accelerated Drug Research: Artificial Intelligence (AI) analyses large datasets and may hasten the release of novel treatments.

Improved Clinical Trials: Artificial Intelligence (AI) enables clinical trials to be more precise and efficient, which accelerates the discovery of new therapies.

Patient Engagement: By providing individuals with tailored health information, AI-powered solutions enable patients to take an active role in their treatment.

Proactive Healthcare Management: By using AI-powered applications and gadgets to track their health, patients may improve their overall health and get early intervention.

Inherent risks and challenges

AI has a lot of potential for the healthcare industry, but integrating it also comes with a lot of dangers and difficulties that need to be carefully considered. The crucial concerns of security and privacy come first. To secure patient privacy and preserve data integrity, strict data protection mechanisms must be put in place for AI systems that handle enormous amounts of sensitive medical data. The ethical issues underlying AI-driven decision-making are also quite significant, particularly in situations where human lives are involved. It is crucial to guarantee AI algorithms' accountability, justice, and transparency to establish and preserve user confidence in these systems. Fair AI model development and bias correction are necessary for equal healthcare delivery. In addition, the constantly changing field of healthcare legislation demands careful adherence to standards to stay out of trouble legally and maintain moral principles. To move forward, the remainder of the report will provide a thorough analysis of the legal and moral ramifications of artificial intelligence (AI) in healthcare. It will do this by looking at the technology's possible advantages and inherent hazards through the lenses of security and confidentiality, as well as offer advice on how to responsibly navigate these complex issues [8].

4. Discussion /Analysis

Ethical and legal implications of AI in healthcare

The incorporation of AI systems into the constantly changing healthcare landscape raises a complicated web of legal and moral problems, with a special emphasis on security and privacy. Strong patient data protection is essential to the moral use of AI in healthcare: the significant amounts of sensitive information handled demand strict data protection procedures to ensure confidentiality, integrity, and availability. Compliance with relevant data protection laws, such as the General Data Protection Regulation (GDPR) in the European Union or the Health Insurance Portability and Accountability Act (HIPAA) in the United States, is essential to protecting patient privacy.

Informed patient consent is another area where ethical considerations apply. Here, openness in the use of data is emphasized, and patients are given the freedom to agree or decline based on a thorough understanding of how their information is gathered, processed, and used. Furthermore, the opaque character of certain AI algorithms, often referred to as "black boxes," presents a problem for accountability and transparency. It is therefore essential to build procedures that inform patients and healthcare professionals about AI-driven judgments; this fosters confidence in the technology and ensures accountability, particularly when AI systems influence important medical decisions.

Given the disparities in privacy and security laws across nations and regions, it is essential to evaluate the current legal environment in the context of the particular application. Any gaps and inconsistencies in the legal framework must be identified so that AI in healthcare can comply with the strictest ethical guidelines and legal requirements.

Analysis of privacy and security risks in the integration of AI:

Although the use of artificial intelligence in healthcare has enormous promise, certain privacy and security concerns should be carefully considered.

• Data privacy: There is a greater chance of data breaches, unauthorized access, or abuse due to the large volume of patient data that AI systems gather and analyze. It is critical to protect patient information via data anonymization, access controls, and encryption.

• Data Security: Ensuring the safety of healthcare data is necessary to maintain patient confidentiality and prevent data breaches. AI integration requires strong cybersecurity protocols, such as frequent security audits and threat assessments.

• Cyberthreats: AI systems are susceptible to cyber threats such as malware and hacking attacks. Because medical data is so valuable, criminals see the healthcare industry as a priority target. Organizations need to invest in strong cybersecurity defenses and incident response procedures.

• Bias in AI Systems: When AI systems are trained on data that contains biases, they may unintentionally reinforce them. Particularly where AI influences medical choices, healthcare organizations need to be very careful to identify and reduce biases to guarantee fair healthcare delivery.

• Examples of Effective Integration: Despite the difficulties, many healthcare institutions have efficiently incorporated AI while controlling security and privacy issues. These examples highlight best practices for implementing ethical AI, safeguarding data, and maintaining cybersecurity, and they provide insightful case studies for anyone attempting to navigate the challenging landscape of integrating artificial intelligence in healthcare.
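The data-protection measures mentioned above (anonymization, access controls, encryption) can be illustrated with a small sketch. This is an illustrative example only, not drawn from the report's cited sources: it pseudonymizes a patient identifier with a salted SHA-256 hash, so records stay linkable for analysis without exposing the raw ID. The class name and salt value are hypothetical.

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.HexFormat;

public class Pseudonymize {
    // Replaces a direct patient identifier with a salted SHA-256 digest.
    // The same ID + salt always yields the same pseudonym (so records can
    // still be linked), but the raw ID cannot be read back from the output.
    static String pseudonym(String patientId, String salt) {
        try {
            MessageDigest md = MessageDigest.getInstance("SHA-256");
            byte[] digest = md.digest((salt + patientId).getBytes(StandardCharsets.UTF_8));
            return HexFormat.of().formatHex(digest);
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e); // SHA-256 is always available
        }
    }

    public static void main(String[] args) {
        // Hypothetical patient ID and per-project salt, for illustration only.
        System.out.println(pseudonym("PATIENT-1234567", "per-project-secret-salt"));
    }
}
```

In practice the salt would be kept secret and managed separately from the data, so that the mapping cannot be reversed by brute force over known identifiers.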

Figure 1: Ethical and privacy issues in healthcare
Source: [9]

Figure 2: Success factors of implementation of AI in healthcare
Source: [10]

5. Conclusion

This thorough analysis has clarified the revolutionary possibilities of artificial intelligence (AI) in the healthcare sector, highlighting the critical role of cutting-edge AI technologies and thoughtfully tackling ethical and legal issues, especially those concerning security and privacy. The main conclusions highlight how AI, fueled by innovations in robotics, computer vision, natural language processing, and machine learning, has brought about a period of unparalleled potential for the healthcare industry. It facilitates early illness detection, tailored therapy, operational effectiveness, and rapid medical research, which leads to improved patient outcomes and resource efficiency. It is abundantly clear, nevertheless, that there are significant obstacles along this trajectory. Considering the sensitivity and sheer amount of patient data at risk, security and privacy worries are major issues; transparency and informed consent are essential ethical requirements; and the "black box" character of AI algorithms demands creative solutions for accountability and explainability. An analysis of the state of privacy and security laws today shows a disjointed environment, underscoring the need for constant adaptation and harmonization to keep up with the rapid advancement of artificial intelligence. This report's importance stems from its steadfast dedication to identifying these issues and outlining a solution. In healthcare, resolving privacy and security issues in AI adoption is not a choice; it is a moral imperative. A trustworthy artificial intelligence ecosystem in healthcare can only be shaped by implementing the recommendations made here, which include strict information security, informed consent, algorithmic transparency, and a dedication to regulatory compliance.

6. References


MITS4002 Object-Oriented Software Development Report Sample

You will be marked based on the zipped file you submit on Moodle. You are welcome to check your file with your lab tutor before submission. No excuses will be accepted for file corruption or for absence from lecture or lab classes where details of lab requirements may be given. Please make sure that you attend the lecture EVERY WEEK, as low attendance may result in academic penalty or failure of this unit.

This assessment item relates to the unit learning outcomes as in the unit descriptors.

This checks your understanding about object-oriented software development.

This assessment covers the following LOs.

LO1 Demonstrate understanding of classes, constructors, objects, data types and instantiation; Convert data types using wrapper methods and objects.

LO2 Independently analyse customer requirements and design object-oriented programs using scope, inheritance, and other design techniques; Create classes and objects that access variables and modifier keywords. Develop methods using parameters and return values.

LO3 Demonstrate adaptability in building control and loop structures in an object-oriented environment; Demonstrate use of user defined data structures and array manipulation.

Tank Circuit Program

Print your Student Name and Student Number.

1. Calculate the capacitance, C, from the inputs E (permittivity), A (cross-sectional area), and d (separation distance).

2. Calculate the resonant frequency, f, of a tank circuit with the above C and input L.

C = E·A / d    and    f = 1 / (2π√(L·C))


Example values: A = 5 mm²; E = 8.85×10⁻¹² F/m (hardcoded).

L = 1 μH; separation distance ≈ 1 mm (or less).

Round the Resonant frequency to two decimal places.

Here is a sample run:

Sample 1:

John Smith JS00001

Enter Capacitor Area (mm^2): 5

Enter Capacitor separated distance (mm): 0.5

Enter Inductance of the inductor (uH): 1

John Smith’s LC Tank Circuit Resonant Frequency: 16.92 MHz
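The specification above can be sketched as a small Java program. This is an illustrative sketch, not the marked solution: the class and method names are my own, and to reproduce the sample output of 16.92 MHz the area and distance are combined exactly as entered (A in mm², d in mm), matching the sample run, with only the inductance converted from µH to henries.

```java
import java.util.Scanner;

public class TankCircuit {
    static final double E = 8.85e-12; // permittivity, hardcoded as specified

    // Combines A and d as entered (mm^2 and mm), which is what reproduces
    // the sample output of 16.92 MHz for inputs 5, 0.5, 1.
    static double resonantFrequencyMHz(double areaMm2, double distMm, double inductanceUH) {
        double c = E * areaMm2 / distMm;                 // C = E*A/d (temporary value)
        double lc = (inductanceUH * 1e-6) * c;           // inductance converted to henries
        double fHz = 1.0 / (2 * Math.PI * Math.sqrt(lc)); // f = 1/(2*pi*sqrt(L*C))
        return Math.round(fHz / 1e6 * 100.0) / 100.0;    // MHz, rounded to two decimals
    }

    public static void main(String[] args) {
        Scanner in = new Scanner(System.in);
        System.out.println("John Smith JS00001");
        System.out.print("Enter Capacitor Area (mm^2): ");
        double a = in.nextDouble();
        System.out.print("Enter Capacitor separated distance (mm): ");
        double d = in.nextDouble();
        System.out.print("Enter Inductance of the inductor (uH): ");
        double l = in.nextDouble();
        System.out.printf("John Smith's LC Tank Circuit Resonant Frequency: %.2f MHz%n",
                resonantFrequencyMHz(a, d, l));
    }
}
```

With the sample inputs (5, 0.5, 1) this computes C = 8.85×10⁻¹¹ and f ≈ 16.92 MHz.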


1. Did you store temporary values? Where and why?

2. How did you deal with errors? (Refer to the code/code snippet in your answer)

3. If the value E, permittivity was changed regularly, how would you change your code?

Submit the following items:

1. Submit this Word document with the following:

a. Copy of your code (screenshot – includes comments in your code)

b. Screenshot of the output of your code (3 times with expected values, 2 times with non-expected values – such as a zero as an input)

c. Your written response to the questions (Q1-3)

Screenshot of the Code as required:

Output 1:

Output 2:

Output 3:

Output 4:

Output 5:

Questions and Answers:

1. Did you store temporary values? Where and why?

Temporary values are used in the provided Java code to store the computed capacitance (C) and resonant frequency (f), held in the variables C and f. This is why they are employed:

The computed capacitance, which is an intermediate outcome obtained from user inputs and a formula (C = EA/d), is stored in the variable C.

The computed resonant frequency is another intermediate result, obtained from user inputs and the formula f = 1 / (2 * π * sqrt(L * C)), and is stored in the variable f.

These temporary variables are needed to store intermediate results for further processing and to present the final results in a user-friendly way (Chimanga et al., 2021).

2. How did you deal with errors? (Refer to the code/code snippet in your answer)

Error handling in the code is simple: it is assumed that the user will input valid numerical values, and the code does little in the way of validation or error management. The user is implicitly expected to provide sensible values for the inputs (capacitor area, separation distance, and inductance), although this is not stated explicitly.

You can add extra validation checks to make sure the input values fall within acceptable ranges and are of the right data types, which improves error handling and robustness. For instance, you can verify that the values are positive and fall within the ranges that make sense for this particular application.
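As a sketch of the validation described above (the class name, helper names, and messages are hypothetical, not from the screenshotted code), a helper can keep re-prompting until the input is a strictly positive number, which also guards against a zero separation distance causing division by zero:

```java
import java.util.Scanner;

public class InputValidation {
    // A physical quantity in this program is valid only if it is a finite,
    // strictly positive number (a zero distance would divide by zero).
    static boolean isValidInput(double v) {
        return Double.isFinite(v) && v > 0;
    }

    // Keeps prompting until the user supplies a valid positive number.
    static double readPositive(Scanner in, String prompt) {
        while (true) {
            System.out.print(prompt);
            if (in.hasNextDouble()) {
                double v = in.nextDouble();
                if (isValidInput(v)) return v;
                System.out.println("Value must be greater than zero.");
            } else {
                System.out.println("Please enter a number.");
                in.next(); // discard the invalid token and re-prompt
            }
        }
    }

    public static void main(String[] args) {
        // Demonstrates the validity rule without requiring interactive input.
        System.out.println(isValidInput(0.5)); // a usable distance
        System.out.println(isValidInput(0.0)); // rejected: would divide by zero
    }
}
```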

3. If the value E, permittivity was changed regularly, how would you change your code?

You can adjust the code to accept this number as an input from the user if the permittivity (E) value is prone to frequent changes. The code currently has a hardcoded value for E.

We can request the user to enter the permittivity value at runtime, exactly like the other input values, rather than hardcoding this number. Here is an illustration of how we might change the code to accomplish that:
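One possible way to make E a runtime input, under the same assumptions as the earlier sketch (my own class and method names, and inputs combined as entered to match the sample run; not the original screenshotted code):

```java
import java.util.Scanner;

public class TankCircuitConfigurableE {
    // Same computation as before, but E is now a parameter rather than a constant.
    static double frequencyMHz(double e, double areaMm2, double distMm, double inductanceUH) {
        double c = e * areaMm2 / distMm;                              // C = E*A/d
        double fHz = 1.0 / (2 * Math.PI * Math.sqrt(inductanceUH * 1e-6 * c));
        return fHz / 1e6;                                             // convert Hz to MHz
    }

    public static void main(String[] args) {
        Scanner in = new Scanner(System.in);
        // Previously hardcoded; now read at runtime so a changed permittivity
        // never requires editing and recompiling the program.
        System.out.print("Enter permittivity E (F/m): ");
        double e = in.nextDouble();
        System.out.print("Enter Capacitor Area (mm^2): ");
        double a = in.nextDouble();
        System.out.print("Enter Capacitor separated distance (mm): ");
        double d = in.nextDouble();
        System.out.print("Enter Inductance of the inductor (uH): ");
        double l = in.nextDouble();
        System.out.printf("Resonant Frequency: %.2f MHz%n", frequencyMHz(e, a, d, l));
    }
}
```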

This update allows for flexibility when E needs to be modified frequently because the user can now input the permittivity value each time the programme is run (Saleh et al., 2021).



TECH1100 Professional Practice and Communication Report 1 Sample

Your Task

Your first assessment in this subject requires you to write an email which determines the factors of success for IT professionals that align with the expectations of diverse stakeholders, with an emphasis on stakeholder engagement.

Assessment Description

In this assessment task, you are required to demonstrate your understanding of the factors contributing to the success of IT professionals in stakeholder engagement, particularly those that align with the expectations of diverse stakeholders. You will write an email to your manager, summarising your research findings and providing recommendations for effective stakeholder engagement. The purpose of this email is to communicate your knowledge, insights, and recommendations in a professional context. This assessment aims to achieve the following subject learning outcomes:

LO1 - Determine the factors of success for IT professionals that align with the expectations of diverse stakeholders.

Assessment Instructions

For this assessment, you will need to submit a Word Document that emulates an email, with the following items that need to be covered in the assessment:

• Imagine you are an IT professional assigned to lead a project with diverse stakeholders.

• Write an email to your manager, summarising your research findings on the factors of success for IT professionals in stakeholder engagement.

• Provide a clear and concise overview of the key factors that contribute to successful stakeholder engagement, emphasizing their alignment with diverse stakeholder expectations.

• Include examples or case studies to support your points and illustrate the practical application of the identified success factors.

• Present well-supported recommendations for effective stakeholder engagement strategies that IT professionals can implement to meet diverse stakeholder expectations.

• Address any potential challenges or considerations associated with stakeholder engagement in the email.

• Use a professional and respectful tone throughout the email, ensuring clarity and coherence in your writing.



The Manager,

Date: 24/11/2023

Subject: Effective stakeholder engagements

Research was undertaken to identify the key success factors for an IT professional in terms of stakeholder engagement, and three main factors were identified: stakeholder management laden with social responsibilities, assessment of the stakeholders' expectations of the given project, and an effective communication channel for the stakeholders. It has been found that ethics plays a crucial role in the management of stakeholders with social responsibilities. Stakeholder management enhances the trust associated with a particular project; in other words, relational management significantly involves trust factors, which can be established with the help of an effective communication process. Societal stakeholders are stakeholders who are engaged through different socio-dynamic aspects; examples of such stakeholders for the present IT project are the immediate communities, the general public, the environment, NGOs (non-governmental organisations), the media, trade unions, and industry associations. IT professionals must consider the impact of their actions on the social environment, which should not be adverse either immediately or in the long run (de Oliveira and Rabechini Jr, 2019, p. 132). It has been found that the Australian Computer Society provides effective ethical guidelines to be followed by a successful IT professional, and the PMBOK (Project Management Body of Knowledge) can be used to understand project management and its associated processes and practices (Strous et al., 2020). The stakeholders' expectations of the project can be identified by the IT professional through a few process steps. Transparency related to the various aspects of the undertaken project needs to be stated with utmost clarity, covering the proposed timeline for project completion, the financial budget plan, and the risks and challenges associated with the project.
After that, a successful assessment of the stakeholders can be conducted with the help of the knowledge gathered through suitable communication processes.
The main stakeholders identified for the present IT project include internal stakeholders such as employees, policymakers, and investors, while the external stakeholders include customers, suppliers, and associated social or legal bodies. The adoption of disruptive technologies in the IT field has necessitated the development of invisible interactions with the potential to deliver enhanced productivity. In the digitalised era, the disruptors include blockchain, digital reality, and cognitive technology, and it is important for the IT professional to efficiently communicate the uses and resultant advantages of these disruptors to earn positive support from the stakeholders in the project. The stakeholders' major expectation is to complete the project within the proposed timeline while utilising financial resources judiciously (Frizzo Barker et al., 2020, p. 54). The utility of quantum computing needs to be addressed in regular meetings that review the decisions made regarding the given project; the stakeholders can thereby derive satisfaction by being involved in the decision-making process. Effective integrity settings are necessary for the professional environment of an IT project. Such settings encourage professionals to take on more skill-based activities, thereby increasing their expertise, while professional societies help determine the level of professional aspiration of individual employees (Tavani, 2015). Recognition for better performance is also suitably handled within this professional environment. Stakeholder mapping can be used to address the expectations of individual stakeholders: factors such as impact, interest, and influence are involved in the mapping process so that the diverse expectations of the various stakeholders are suitably addressed.
Segmentation of the stakeholders can be used to make individual communications, which can help the professional understand each stakeholder's feedback on a particular business process (Tristancho, 2023). This data can help rectify and change the project's decisions with suitable strategies. Internal stakeholders, such as employees, benefit from appropriate remuneration as per the evaluation of their work, while the involvement of external stakeholders has been identified as instrumental in addressing the valuation of the project in terms of its functionality. The investors are focused on gaining profitability from the financial resources invested in the project's development. Effective planning and regular communication can deal with the diverse expectations of the stakeholders with ease and convenience.
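The stakeholder-mapping idea above can be sketched as a simple influence-interest classification. This is an illustrative sketch only: the category names follow the commonly used power-interest grid, and the 0-10 scales and cut-off of 5 are my own assumptions, not anything prescribed in this email.

```java
public class StakeholderMap {
    // Classifies a stakeholder on a standard influence-interest grid.
    // Scores are assumed to be on a 0-10 scale with 5 as the cut-off
    // (illustrative thresholds, not from the source).
    static String classify(int influence, int interest) {
        if (influence >= 5 && interest >= 5) return "Manage closely";
        if (influence >= 5)                  return "Keep satisfied";
        if (interest >= 5)                   return "Keep informed";
        return "Monitor";
    }

    public static void main(String[] args) {
        // Hypothetical scores for two of the stakeholder groups named above.
        System.out.println("Investor: " + classify(9, 8)); // high influence, high interest
        System.out.println("Media: "    + classify(3, 7)); // low influence, high interest
    }
}
```

Segmenting stakeholders this way determines how much individual communication each group receives, which is the point the paragraph above makes.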

The Apple iPhone development project and the Ford Pinto design project addressed the issues of stakeholder engagement with the help of adequate project-management success factors (Dubey and Tiwari, 2020, p. 381). An effective communication channel was ensured to run in parallel with each project so that no miscommunication could hinder its development, and secrecy of the data was maintained to the extent that the stakeholders did not feel left behind in the process.

The main challenge in stakeholder management is associated with effective decision-making for a particular project. Stakeholders hold differing opinions regarding a single task to be employed in the project, and their different perspectives and expertise make for complex decision-making processes within IT-industry projects. The varying differences in the stakeholders' expectations can create considerable conflict over the priorities of the particular project. Stakeholders will use the resources to decide on the priorities to be addressed by the present project so that their expectations are suitably met, and these differences in priorities lead to conflicting decisions within the project (Anicic and Buselic, 2020, p. 250). The ongoing project might face considerable constraints regarding the available financial resources, and expertise in the technological aspect can also act as a constraint on IT project development. Eventually, limited resources fail to address all the expectations placed on the given project by various stakeholders. Accessibility of real-time data regarding the ongoing project can create further chaos in the feedback and evaluation process. It is often the case in IT projects that the most critical stakeholders lack the technological knowledge involved, and the resultant evaluative feedback from such stakeholders makes the project suffer adversely.
Enhancement of collaboration and engagement values can be suitably addressed for effective stakeholder engagement in the IT project. The collaboration value will help enhance the actual valuation of the project for internal and external stakeholders, and the implementation of innovative and attractive collaboration features for individual stakeholders will help motivate them to perform better in favour of the project. An example of an effective collaborative feature might be the opportunity to access the technological knowledge required to implement the project; this knowledge can then be utilised for further such projects, drawing the stakeholders towards the collaboration process. Value creation for the stakeholders can be addressed by identifying the relevant interest domains of the individual stakeholders. For instance, announcing that the best employee will be provided with a salary raise effectively addresses the issue of engagement while enhancing the entire team's performance.

The concept of a Big Room can enhance the rapport between the internal and external stakeholders of an IT project. The Big Room can help enhance community cohesiveness, which signifies stakeholder engagement at a greater level; in other words, the Big Room can be regarded as a combined workplace for all the representatives of the identified stakeholders. The representatives can be physically present or work on the project remotely via appropriate digital mediums (Coates, 2019). The Big Room helps the issues faced while working on the project be communicated effectively as they arise. The overall impact of introducing this concept can enhance stakeholder engagement effectively for an IT project. The IT project can thereby address all the relevant expectations of the stakeholders while prioritising the need to reach its ultimate goal, with the help of an efficient stakeholder management programme. The aforementioned policies and strategies can be effectively implemented to meet the demands of stakeholders at both the internal and external levels.

The SFIA (Skills Framework for the Information Age) can be used to develop the skills needed for a successful career as an IT professional. Skill assessment is one of the various functions for which public and private organisations extensively use the framework; sourcing, supply management, relationship management, contract management, and customer service support are among its main features. Goal-directed communication with stakeholders can be learned by IT professionals with the help of the framework, aiding communication in formal environments (Lehtinen and Aaltonen, 2020, p. 87). Risk management is also addressed with the help of the framework.

Emotional intelligence and effective communication strategies can ultimately act as instruments for managing stakeholder engagement in the IT project. Temporal flexibility allowed for the different stakeholders' activities on the project makes room for enhanced performance and engagement. The tasks undertaken by the different stakeholders are technologically advanced; therefore, adequate rest on the part of the stakeholders is necessary. In other words, creativity and intellectual capacity are best reflected when the mind is calm while a particular task is performed (Reynolds, 2018). The recommended temporal flexibility can be suitable for enhancing stakeholder engagement.

Thanking you.

Regards and wishes,

Project leader.

Reference List: