Database Administrator for Department Store

Question 1:

Potential sales are the estimated sales that could be achieved if everyone living within a market or trade area transacted only within that trade area. Actual sales are compared with potential sales to determine whether a surplus exists. If actual sales exceed potential sales, there is a surplus. A surplus suggests that people travel from outside the trade area to shop there, or that residents of the trade area spend more than would normally be expected given their income levels. Conversely, if potential sales exceed actual sales, the trade area suffers a sales leakage. A sales leakage indicates that some residents of the trade area shop outside it, or that residents spend less than would be expected given their income levels (Mukerjee, 2007).

The department store database contains the retail sales data accumulated from sales in the store. The sales process begins when a customer brings the products intended for purchase to any of the store's registers. The sales associate scans each product's barcode; this populates the transaction table, which is later used to generate a sales receipt listing the products, department, price data, and other information relevant to the customer. When the customer pays for the products, the payment details are entered in the transaction table and a printed receipt is produced.

The entities derived from the department store database are as follows:

 

Entity     Field           Description              Data Type
Store      Store_ID        Store ID                 Varchar2
           Store_Name      Store Name               Varchar2
           Store_Loc       Store Location           Varchar2
Employee   Emp_ID          Employee ID              Number
           Emp_FN          Employee First Name      Varchar2
           Emp_LN          Employee Last Name       Varchar2
           Emp_Add         Employee Address         Varchar2
Vendor     Vend_ID         Vendor ID                Number
           Comp_Name       Company’s Name           Varchar2
           Comp_Location   Company’s Location       Varchar2
           Comp_Phn_Num    Company’s Phone Num      Number
           Pro_ID          Product ID               Number
           Cost            Product Cost             Number
Customer   Cust_ID         Customer ID              Number
           Cust_FN         Customer First Name      Varchar2
           Customer_LN     Customer Last Name       Varchar2
           Cust_Add        Customer Address         Varchar2
           Cust_Phn_Num    Customer Phone Num       Number
Product    Pro_ID          Product ID               Number
           Pro_Type        Product Type             Varchar2
           Pro_Menu        Product Menu             Varchar2
           Pro_Price       Product Price            Number
Invoice    Invoice_ID      Invoice ID               Number
           Cust_ID         Customer Identification  Number
           Date            Transaction Date         Date

Table 1
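These entities could be declared in Oracle-style SQL roughly as follows. This is a minimal sketch based on Table 1; the primary and foreign key constraints, the column lengths, and the renaming of the reserved word Date to Invoice_Date are assumptions not stated in the table, and the Employee and Vendor tables would follow the same pattern.

    CREATE TABLE Store (
      Store_ID   VARCHAR2(10) PRIMARY KEY,   -- store identifier
      Store_Name VARCHAR2(50),
      Store_Loc  VARCHAR2(100)
    );

    CREATE TABLE Customer (
      Cust_ID      NUMBER PRIMARY KEY,       -- customer identifier
      Cust_FN      VARCHAR2(50),
      Customer_LN  VARCHAR2(50),
      Cust_Add     VARCHAR2(100),
      Cust_Phn_Num NUMBER
    );

    CREATE TABLE Product (
      Pro_ID    NUMBER PRIMARY KEY,          -- product identifier
      Pro_Type  VARCHAR2(50),
      Pro_Menu  VARCHAR2(50),
      Pro_Price NUMBER(10,2)
    );

    CREATE TABLE Invoice (
      Invoice_ID   NUMBER PRIMARY KEY,
      Cust_ID      NUMBER REFERENCES Customer (Cust_ID),  -- assumed foreign key
      Invoice_Date DATE                      -- "Date" renamed: DATE is a reserved word
    );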

Question 2:

It should be recognized that database design concentrates on identifying the information that must be stored. Later, queries can be written to search the information, input forms can be built to enter new information, and reports can be designed to retrieve and display the information to meet client needs. At this stage the most essential step is to organize the information correctly so that the database system can handle it efficiently. With regard to sales transactions, all organizations deal with entities or objects such as customers, sales, products, and employees. From a systems perspective, an entity is something in the real world that one wants to keep track of, and that entity is described by its attributes or properties. For instance, a product entity has an identification, type, menu, and price. In modeling terms, an entity recorded with its attributes is known as a CLASS. In a programming environment a CLASS can also have methods or functions that it can perform, and these can be recorded with the CLASS.
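As a concrete, hedged illustration, Oracle PL/SQL can model such a CLASS directly as an object type whose attributes match the product entity; the type name product_t and the price_with_tax method are made up for this sketch and are not part of the store's stated requirements.

    -- A product "class": attributes plus a method, modeled as an Oracle object type
    CREATE OR REPLACE TYPE product_t AS OBJECT (
      pro_id    NUMBER,
      pro_type  VARCHAR2(50),
      pro_menu  VARCHAR2(50),
      pro_price NUMBER(10,2),
      MEMBER FUNCTION price_with_tax (p_rate NUMBER) RETURN NUMBER
    );
    /

    CREATE OR REPLACE TYPE BODY product_t AS
      MEMBER FUNCTION price_with_tax (p_rate NUMBER) RETURN NUMBER IS
      BEGIN
        RETURN pro_price * (1 + p_rate);   -- price plus tax at the given rate
      END;
    END;
    /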

Question 3:

The crow’s foot notation was created by Gordon Everest, who initially used the term "inverted arrow" but now simply calls it a fork. The fork naturally denotes a cardinality of many through its many toes, applied to an entity, that is, a person, place, or thing about which one needs to find and store various pieces of information.

 

Figure 1

The attributes describe the information we are interested in storing. An entity also has an identifier, which uniquely distinguishes one instance of the entity. Cardinality and modality are indicators of the business rules around a relationship. Cardinality refers to the maximum number of times an instance in one entity can be associated with instances in the related entity (Stewart, 2008). Modality refers to the minimum number of times an instance in one entity can be associated with an instance in the related entity. Cardinality can be 1 or many, and its symbol is placed on the outer ends of the relationship line, closest to the entity (Herden & Pallack, 2004); modality can be 1 or 0, and its symbol is placed on the inside, next to the cardinality symbol. A cardinality of 1 is represented by a straight line, while a foot with three toes is drawn for a cardinality of many. A modality of 1 is drawn as a straight line, and a modality of 0 is drawn as a circle. Consider an entity called DESK. Data needs to be stored about several desks: the desk id, the desk color, and the size. These are the attributes of the entity, and desk id is the identifier; given the desk id, we would be able to locate one particular desk, since no two desks can share the same id. 'Is assigned' is the name given to the relationship, denoted by a line that joins the two entities together. At this point neither modality nor cardinality has been assigned yet.
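A relational sketch of how these rules surface in a schema, assuming the DESK entity "is assigned" to an Employee table keyed on Emp_ID; whether the foreign key allows NULL is exactly the modality decision the text says has not yet been made.

    CREATE TABLE Desk (
      Desk_ID    NUMBER PRIMARY KEY,   -- identifier: no two desks share an id
      Desk_Color VARCHAR2(20),
      Desk_Size  VARCHAR2(20),
      Emp_ID     NUMBER                -- 'is assigned' relationship to Employee
                 REFERENCES Employee (Emp_ID)
      -- Allowing NULL here would mean a modality of 0 (a desk may be unassigned);
      -- adding NOT NULL would make the modality 1 (every desk must be assigned).
    );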

Question 4:

Big Data can be helpful at all levels of a business. This is certainly true for supply chain management in a department store: improving the store’s supply chain activities, such as new product development, production, and distribution, can boost revenue, profits, and customer value. Big Data administration has positive consequences for the supply chain network. By strengthening its supply network, the store can get the goods and services a buyer needs to them quickly and efficiently. Businesses that deliver such value to buyers can increase repeat purchase behavior, develop consumer loyalty, and earn more value from the customer.

Big Data tools will permit the department store to create complex analytical models that estimate margins if different combinations of suppliers are chosen. These models can consider an extensive range of variables, for example, the extra costs caused by fluctuations in the rate at which different suppliers can deliver their products; one-time switching costs, such as long-term contract terminations; and even approximations of supplier reliability, which the department store can use to create performance forecasts of different vendor mixes. To maximize profits, management can choose the mixes with the highest rate of return.

Big Data enables sales and marketing to identify changes in purchaser behavior and trends at both micro and macro levels. Recognizing that a purchaser is about to exit a business cycle, based on past patterns of behavior, allows a business administrator either to quickly disqualify the opportunity or to capitalize on it. Being able to recognize a change in employee performance allows contextual training to be delivered when it can have the most effect; that is a more effective approach than forms of training that lack this real-time aspect (Hurwitz et al., 2013). Organizations that monitor by concentrating on changes in business trends have an essential early warning system, a core success factor for social organizations. Cloud’s Intelligent Forecasting Suite includes tools to incorporate management judgment and multi-dimensional projections. Marketing and sales can use Big Data applications to identify ways to improve return on investment by aligning content and programs with changing purchaser expectations. These same insights enable marketing to show employees how to make more informed decisions.

A stored procedure is simply prepared SQL code that one can save so that the code can be reused again and again. If one considers a query that is written repeatedly, rather than typing that query every time, one would save it as a stored procedure and then simply call the stored procedure to execute the SQL code saved as part of it. In addition to running the same SQL code repeatedly, a stored procedure can accept parameters, so depending on what is needed, the stored procedure can act accordingly.
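A minimal sketch of such a parameterized stored procedure in PL/SQL, using the Invoice table from Table 1; the procedure name count_customer_invoices and the idea of returning an invoice count are assumptions made only for illustration.

    CREATE OR REPLACE PROCEDURE count_customer_invoices (
      p_cust_id IN  NUMBER,   -- parameter: which customer to look at
      p_total   OUT NUMBER    -- result: number of invoices for that customer
    ) AS
    BEGIN
      SELECT COUNT(*)
        INTO p_total
        FROM Invoice
       WHERE Cust_ID = p_cust_id;
    END;
    /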

An example is CREATE OR REPLACE FUNCTION ... RETURN datatype, followed by BEGIN, the PL/SQL statements, and RETURN. For instance, the SQL SUM function can be used to calculate and return the sales of a particular department along with the name of that department. SUM is a useful building block because it can be used to total salaries and other ranges over a given period, to determine the net income of the business, and to return the name of a department together with its total sales.
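A hedged sketch of such a function: it assumes a hypothetical Sales table with Dept_Name and Amount columns (not part of Table 1) and returns the total sales for one department.

    CREATE OR REPLACE FUNCTION dept_total_sales (
      p_dept_name IN VARCHAR2
    ) RETURN NUMBER AS
      v_total NUMBER;
    BEGIN
      SELECT SUM(Amount)          -- total sales for the requested department
        INTO v_total
        FROM Sales
       WHERE Dept_Name = p_dept_name;
      RETURN NVL(v_total, 0);     -- return 0 when the department has no sales
    END;
    /

It could then be called as SELECT dept_total_sales('Electronics') FROM dual; the department name is, again, only an example.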

Question 5:

With regard to services, pricing is a very sensitive issue. While having a good pricing strategy may not be a competitive advantage, having the wrong pricing structure can mean disaster. The wrong price, at a minimum, means diminished profits, but it may also lead an organization into cash flow problems and, eventually, insolvency. In cloud-based solutions, everything, platform, infrastructure, and software, is sold as a service priced on some kind of usage metric. This allows a much wider range of pricing strategies, letting organizations experiment with different models as they arise. At the same time, this wider range of choices carries more risk of error, which is what many organizations are trying to avoid.

One example is the package-based model, where clients pay a flat rate for a bundle that entitles them to a certain usage level and must pay a much higher price for additional use (Li, 2011). This model allows the optimization of both usage and profits. Clients who use less pay a higher unit price, but can receive some discount since they will most likely not exceed their limit. Heavy users, on the other hand, can get an attractive discount. In both cases, the excess-usage charges can easily cover any costs created by a customer going over the limit. At the same time, the package model is also attractive for customers: it allows much better planning of costs, which is essential for organizations of all sizes.

Cloud-hosted applications and services can be extremely useful to delivery or hosting organizations that produce, host, and deliver services. They also offer advantages to large organizations that generally use hosted and cloud-based solutions. Organizations such as Amazon and Microsoft are prominent cloud providers. The initial cost of using cloud services tends to be significantly lower than building on-premises IT infrastructure. According to scholarly research, the saving could average 40% depending on organization size, although more research still needs to be carried out. Since web service usage is metered by volume and time, purchasers benefit from lower and more flexible pricing choices (Vanmechelen, Altmann & Rana, 2012). These choices range from fixed prices based on a minimum level to pay-as-you-go.

The cloud offers a great deal of room to experiment with pricing structures and prices, for both suppliers and customers, so both parties experiment by trying things out and testing as much as possible with different pricing models to see what fits best. As the examples demonstrate, failing to think through pricing, or making inaccurate assumptions, can result in reduced profits or even significant disaster for organizations that offer cloud services. Likewise, for everyone who buys cloud services, understanding the ideas behind the pricing structures used by suppliers allows better use of the services, or even negotiation of better discounts.

Information stored in a cloud-based system should be closely monitored. This is particularly true when deploying IaaS in a public cloud. You need to know who is accessing the data, how the data was accessed, the location from which it was accessed, and what happened to that data after it was acquired: was it sent to another client or replicated to another site? An organization can address these issues by using current Rights Management services and applying restrictions to all data that is considered essential. Create policies for this data and then deploy those policies in a way that does not require user intervention.

The other security consideration is authentication. To have a successful Data Loss Prevention arrangement, an organization needs robust authentication and authorization systems in place. It is common knowledge that a user name and password is not the most secure verification mechanism. Consider two-factor or multi-factor authentication for all data that should be restricted. In addition, an organization should tailor its authentication and authorization policies to the level of trust it has in each identity provider for cloud solutions: the level of authorization granted through a provider such as Google Mail is going to be considerably lower than when the provider is the corporate Active Directory environment.

End-to-end encryption is an important security consideration. Cloud-based services need to exploit encryption from end to end. Organizations should use whole-disk encryption, which guarantees that all information on the disk, not simply user data files, is encrypted; this also prevents offline attacks. In addition to whole-disk encryption, ensure that all communications to host operating systems and virtual machines in the framework are encrypted (Yu & Jajodia, 2007); this can be done over IPsec. This includes communications from administration stations as well as communications between the virtual machines themselves. Likewise, when available, deploy mechanisms such as homomorphic encryption to ensure safety and security for users.

Software as a service (SaaS) places the greater part of the responsibility for security administration with the cloud provider and is usually used for services such as customer relationship management and accounting. This popular option is viewed as relatively low risk because it essentially deals only with software and not hardware or storage. With SaaS, organizations can control who has access to these cloud services and how the applications are configured. The complexity of software installation, maintenance, upgrades, and patches, meanwhile, is automated and handled by the provider.

Platform as a service (PaaS) is like SaaS but typically includes further application-specific tooling to help organizations create modified and customized services. For instance, an organization using PaaS could develop its own custom cloud software to perform some particular task, whereas SaaS offerings are generally provided as-is. Most PaaS offerings are multi-tenant, meaning that a portion of the services may be shared with other organizations. This makes it essential for organizations that use PaaS to have a well-defined trust relationship with the provider on security matters such as authentication, source code distribution, route history, and application usage.

With infrastructure as a service (IaaS), organizations get a unified, adaptable cloud bundle that offers tighter control over many parts of a conventional IT framework than they have with SaaS or PaaS. Organizations using IaaS pay on a per-use basis to access services and applications, and can also tap the underlying platform that hosts virtual machine images, networking, and storage environments for extra control. Frequently, IaaS is offered as a private cloud, giving organizations complete internal control over security and availability (Bryant, 2014).

Question 6:

The use of a distributed database management system (DDBMS) is really appropriate here, because a DDBMS governs the storage and processing of logically related information over interconnected computer systems in which both data and processing are distributed among several sites. The data are situated close to the sites of greatest demand and are dispersed to match business requirements. This gives faster data access; new sites can be added to the system without affecting the operations of other sites; there is less danger of a single point of failure, which means that when one of the computers fails, the workload is picked up by other workstations; and because data are replicated to multiple sites, the end user can access any available copy of the data and the end user's request is processed by any processor at the data location.
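As one hedged illustration of distributed data placement in Oracle-style SQL, a branch site can reach a remote database through a database link and keep a locally refreshed copy of a table, so end users query the nearest copy; the link name hq_link, the placeholder credentials, and the refresh policy are all assumptions.

    -- Link from a branch site to the head-office database
    CREATE DATABASE LINK hq_link
      CONNECT TO store_user IDENTIFIED BY store_pwd   -- placeholder credentials
      USING 'HQ_DB';

    -- Local, periodically refreshed copy of the Product table, so branch
    -- queries are served locally even if the head office is unreachable
    CREATE MATERIALIZED VIEW product_local
      REFRESH COMPLETE ON DEMAND
      AS SELECT * FROM Product@hq_link;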

The DBA must be aware of the features of the DBMS in order to apply the proper methods for improving the performance of database structures. Most of the major DBMSs support the following techniques, albeit perhaps under different names, and each can be used to tune database performance. Partitioning is breaking a single database table into sections stored in multiple files; raw partitions versus file systems is choosing whether or not to store database information in an OS-controlled file; indexing is choosing the proper indexes and options to enable efficient queries.
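A minimal sketch of two of these techniques in Oracle-style SQL, re-declaring the Invoice table from Table 1; the choice of partitioning by transaction date and of indexing Cust_ID are illustrative assumptions.

    -- Partitioning: split one logical table into separately stored segments
    CREATE TABLE Invoice (
      Invoice_ID   NUMBER PRIMARY KEY,
      Cust_ID      NUMBER,
      Invoice_Date DATE
    )
    PARTITION BY RANGE (Invoice_Date) (
      PARTITION inv_2015 VALUES LESS THAN (DATE '2016-01-01'),
      PARTITION inv_2016 VALUES LESS THAN (DATE '2017-01-01')
    );

    -- Indexing: support frequent lookups of a customer's invoices
    CREATE INDEX idx_invoice_cust ON Invoice (Cust_ID);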

Question 7:

Lost updates are a concurrency control problem in which data updates are lost during the simultaneous execution of transactions. Uncommitted data is another issue: when trying to achieve concurrency control, uncommitted data causes problems with data integrity and consistency. These problems occur when two transactions are executed concurrently and the first transaction is rolled back after the second transaction has already accessed the uncommitted data, thereby violating the isolation property of transactions. The lost update problem may occur when two concurrent transactions, for example X and Y, are updating the same data element and one of the updates is overwritten by the other transaction.

Uncommitted data occurs when two transactions X and Y are executed concurrently and the first transaction X is rolled back after the second transaction Y has already accessed the uncommitted data, thereby violating the isolation property of transactions. Using the same transaction as in the lost update example, X has two atomic parts, one of which is the update of the stock; the other part is the update of the invoice total, not displayed. If X is forced to roll back because of an error during the update of the invoice total, it will roll back all the way, undoing the stock update as well.
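A minimal PL/SQL sketch of transaction X under these assumptions (Qty_On_Hand and Invoice_Total are hypothetical columns not listed in Table 1); the point is only that the two updates commit or roll back together.

    BEGIN
      -- Part 1: update the stock of the product that was sold (assumed column)
      UPDATE Product
         SET Qty_On_Hand = Qty_On_Hand - 1
       WHERE Pro_ID = 100;

      -- Part 2: update the invoice total (assumed column)
      UPDATE Invoice
         SET Invoice_Total = Invoice_Total + 25
       WHERE Invoice_ID = 5001;

      COMMIT;        -- both changes become permanent together
    EXCEPTION
      WHEN OTHERS THEN
        ROLLBACK;    -- an error in either part undoes the stock update as well
        RAISE;
    END;
    /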

To prepare for the issues posed by multi-user access to data, it is important to review some of the fundamental concepts that relate to concurrency. The transaction is the foundation of data integrity in multi-user databases and the basis of all concurrency schemes. A transaction is defined as a single unbreakable unit of work that affects some data. All of the changes made to data within a transaction are either applied to the database permanently with a COMMIT statement, or the data affected by the changes is returned to its initial state with a ROLLBACK statement. Once a transaction is committed, the changes made by that transaction become permanent and are made visible to other transactions and other users.

Transactions always occur over time, although most transactions complete within a short period. Since the changes made by a transaction are not final until the transaction is committed, every individual transaction must be isolated from the effects of other transactions. The mechanism used to enforce transaction isolation is the lock. A database uses a system of locks to keep transactions from interfering with one another. Transactions can interfere with one another when one transaction is allowed to change a piece of data that another transaction is in the process of altering.
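A hedged sketch of explicit row locking in Oracle SQL that prevents the lost update described earlier: the first session locks the product row with SELECT ... FOR UPDATE, so a second session attempting the same change waits until the first commits or rolls back (Qty_On_Hand remains an assumed column).

    -- Session 1: lock the product row before modifying it
    SELECT Qty_On_Hand
      FROM Product
     WHERE Pro_ID = 100
       FOR UPDATE;                 -- row lock held until COMMIT or ROLLBACK

    UPDATE Product
       SET Qty_On_Hand = Qty_On_Hand - 1
     WHERE Pro_ID = 100;

    COMMIT;                        -- releases the lock; Session 2 may now proceed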

In order to provide data disaster management functions, a few options should be made available. Since the network is the primary means by which data is distributed and shared, it is regularly vulnerable to cyber attacks, for example denial-of-service attacks and spoofing. The network is likewise the primary vehicle for the transmission of harmful software, such as worms and Trojans, between different hosts. Moreover, the introduction of wireless networking into the retail environment has diminished the security of the network and increased the complexity required to secure it.

There are several security threats that may occur when managing the department store database. Excessive privileges can be a threat to the system: when employees are granted default database privileges that exceed the requirements of their job functions, these privileges can be abused, e.g. an employee can modify the amount of inventory in the database for personal gain. The other threat is database injection attacks; the two noteworthy types are SQL injections that target conventional database systems and NoSQL injections that target Big Data platforms. A significant point to acknowledge here is that, although it is technically true that Big Data solutions are impervious to SQL injection attacks because they do not actually use any SQL-based technology, they are still vulnerable to the same basic class of attack. Database backups are frequently unprotected from attack; accordingly, numerous security breaches have involved the theft of database backups. Moreover, failure to audit and monitor the activities of employees who have low-level access to sensitive data can put information at risk. One of the remedies is encrypting the databases and archiving external data. Management should monitor access activity and usage trends continuously to recognize information leakage, unauthorized SQL and Big Data transactions, and protocol and framework attacks. User access rights should be properly managed, dormant users should be identified, excessive privileges should be removed, and lastly sensitive information should be classified (Thuraisingham, 2005).
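Two of these remedies expressed as a minimal SQL sketch: a least-privilege role for a hypothetical sales clerk, and a bind-variable query that never concatenates user input into the SQL text; the role name and the example values are assumptions.

    -- Least privilege: a sales clerk role may read products and record invoices,
    -- but receives no other object privileges
    CREATE ROLE sales_clerk;
    GRANT SELECT ON Product TO sales_clerk;
    GRANT INSERT ON Invoice TO sales_clerk;

    -- Bind variables: user input is bound, never concatenated into the SQL text,
    -- which defeats the classic SQL injection pattern
    DECLARE
      v_count NUMBER;
    BEGIN
      EXECUTE IMMEDIATE
        'SELECT COUNT(*) FROM Invoice WHERE Cust_ID = :1'
        INTO v_count
        USING 12;                  -- the value is supplied as a bind, not as text
    END;
    /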
