DesignSafe, a web-based open platform, allows users to run simulations on high-performance computing resources and to curate and publish data from natural hazards engineering research. Its high-level, end-to-end data management and computational services enhance lifecycle research in the field. The broad range of data gathered through various investigative methods, including simulations, social science studies, and experiments, allows researchers to generate vast and complex data sets. DesignSafe comprises several services and components, among them the Data Depot, the heart of the cyberinfrastructure, which directly supports data sharing and collaboration. Against that background, this paper aims at giving a summation of three projects from the Data Depot. First, project PRJ-2664, Earthquake in the Philippines, looks into the description of the earthquake that took place on December 15, 2019, the damage to structures, the effects on the community, and the resilience aspects. This earthquake was the fourth one to hit the region in a span of two months,…

Database Modeling, Summer 2019. ACME Direct is a direct marketer of books, music, videos, and magazines. The Marketing Director of ACME Direct tested a new book title slightly over one year ago and has decided, based on the results of the test, to promote this title to selected names from the database. Last month ACME Direct purchased, for the first time, new list-enhancement data (age, income, marital status, home value, etc.) not previously on the customer database. Using the saved sample from the original test promotion one year ago, the analyst is preparing to develop a regression model that will aid in predicting the type of customer most likely to order this particular book title. The Marketing Director has asked the analyst to append the new enhancement data to the sample in order to see if any of this “new enhancement data” will come into the regression equation. Do you have any concerns regarding the Marketing Director’s request? If so, explain your concerns to the Marketing Director in 75 words or less. Yes, I am concerned…
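The append step the Marketing Director is asking for can be sketched minimally (the customer IDs and fields below are hypothetical, not from ACME's database). The comments flag the temporal mismatch at the heart of the concern: the saved sample records behavior from a year ago, while the newly purchased enhancement data describes customers as they are today.

```python
# Hypothetical illustration of appending enhancement data to a saved sample.
# The sample was captured at test time (a year ago); the enhancement fields
# reflect current values, so joining them assumes age, income, etc. have not
# changed since the customers ordered (or declined to order).

test_sample = [                      # saved from the promotion one year ago
    {"cust_id": 101, "ordered": 1},
    {"cust_id": 102, "ordered": 0},
]

enhancement = {                      # purchased last month (current values)
    101: {"age": 44, "income": 85_000},
    102: {"age": 31, "income": 52_000},
}

# The append itself is trivial; the modeling risk is that the regression will
# relate past ordering behavior to present-day attributes.
appended = [{**row, **enhancement[row["cust_id"]]} for row in test_sample]
print(appended[0])  # {'cust_id': 101, 'ordered': 1, 'age': 44, 'income': 85000}
```

The sketch makes the objection concrete: any coefficient the regression assigns to an enhancement field reflects a relationship measured across a one-year gap.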

Analyzing & Visualizing Data. Summary of why the visualizations were selected: the visualizations allow the user to retrieve adequate information, skills, or knowledge about the datasets. Moreover, each visualization illustrates the highlighted data appropriately, allowing the needed knowledge to be acquired easily. In one of the chosen visualizations, the tool for the ‘search’ task tends to be application-specific, such as volume visualization. What caught attention? Some domain knowledge seems to have already been incorporated into different visualization systems, either intentionally or unintentionally. For instance, if an extensive repository of knowledge is collected, a visualization system can select a practical transfer function based on information about the input datasets. Were the visualizations effective in presenting the provided data? The visualizations seem to work well, since they update automatically based on the set filters. The visualizations further helped users turn data in the computational space into information and knowledge, demonstrating the value of visualization technology. Improving the visualization: establishing whether the targeted audience is likely to…

A look back at the evolution of the data center shows how the phenomenon began. Data centers have developed into virtual and physical infrastructures, and businesses now run hybrid environments. The composition and the role of the data center have shifted. It used to be that building a data center was a 25+ year commitment, with no flexibility in cabling, inefficiencies in power/cooling, and no mobility within or between data centers. Today it is about efficiency, functionality, and speed. One size does not fit all, and there are various architectures, configurations, and so on. A data center organizes equipment and a company’s IT operations. Data is stored, managed, and disseminated across many different devices. The data center houses computer systems and components such as storage and telecommunications systems; included are redundant power systems, environmental controls, data communications connections, and safety apparatus. Since that time, technological and physical changes in computing and information storage have led us down a winding road to where we are now. Let us take a look at the data center’s evolution toward the current cloud-centric…

How do you determine the specific hardware components needed to complete a task for a particular type of business or organization? Every organization needs to lay out a structure for the use of its hardware and software. From Milestone One, it is established that the hardware and software put in place determine the kinds of operations the organization can carry out. That implies that both hardware and software allow an organization to complete a particular range of activities. Without suitable hardware or software components, the organization will not be able to carry out its various activities properly. It is the mandate of every organization to understand how certain hardware components are structured so that it can come up with a reliable design for completing different tasks. In establishing the specific hardware needs of any organization, an in-depth examination should be completed to make sure a proper assessment is done. Every company has diverse requirements when adopting certain hardware. For instance, a financial institution will require a suitable storage facility to keep most of…

Problems associated with the storage and access of data. Data storage is one of the key activities carried out by many organizations (Bao, Chen & Obaidat, 2018). The process of storing and accessing data can be easy or challenging, and several problems are associated with it. The first problem concerns the security of the data. Some data may be confidential and sensitive, and even a small leakage can lead to major challenges. In this case, corporations need to ensure that their data is secure and confidential. Maintaining data security is challenging, and it requires an organization to be prepared by employing all available tactics. The second problem concerns the cost of running a data center. Many large corporations prefer having their own data centers to acquiring storage services from other organizations, so a corporation that wants its own data center will need to set aside the money required for initial setup and maintenance. The next problem concerns data infrastructure. Large volumes of data need infrastructure in order to be stored appropriately. The…

Observation of the two Excel data tables. The tables show two critical aspects of our country’s statistical data: exports and imports. Exports are the merchandise our country sells to or exchanges with other countries. The total value of our exports has been growing over the last three years: exports in 2018 were 242,683 million dollars, up from 214,323 million dollars in 2017. The exports have improved the overall economy of Vietnam and generated more wealth for the Vietnamese people. The rising value of exports has also reduced unemployment among Vietnam’s young population. Imports are the merchandise we purchase from other countries, such as transport equipment, fuel, iron, and steel. The overall value of imports has also been increasing for the past three years: in 2018 the amount was 235,517 million dollars, compared to 174,804 million dollars in 2016. Some of the main international commercial services in Vietnam included office and telecom equipment, electronic data processing, telecommunications, and transportation. The balance of trade is the sum of all exports of a country within a specified period, which is mainly…
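Using only the export and import figures quoted above (in millions of US dollars), the 2018 balance of trade can be computed directly as exports minus imports; this is a sketch with just the years for which both figures are given in the text.

```python
# Figures quoted in the text, in millions of USD.
exports = {2017: 214_323, 2018: 242_683}
imports_ = {2016: 174_804, 2018: 235_517}

# Balance of trade = exports - imports; 2018 is the only year above
# with both an export and an import figure.
balance_2018 = exports[2018] - imports_[2018]
print(balance_2018)  # 7166 -> a trade surplus of roughly 7.2 billion USD
```

A positive balance, as here, indicates a trade surplus; a negative balance would indicate a deficit.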

Why the United States Should Adopt the European Data Protection Policy. For a long time, the United States has dwelled in the glory of being the pioneer in the generation and storage of data. But with the current changes observed in big data, there are more reasons why the US should adopt the European data protection policy. First of all, there is the aspect of security with regard to citizens’ data in the US, whose seriousness has extensively deteriorated. Comparing the two current regimes for data access and regulation in European countries and in the United States, a lot of disparities have been noted. In the EU, as far as databases of citizen information are concerned, access to such information has always been guarded with a great deal of secrecy (Danezis 234). In the United States, by contrast, political parties and campaigns face almost no regulation with respect to the access, collection, and dissemination of citizens’ data, especially for profiling practices in current political systems, and such practices carry no legal implications under US privacy law. A live example of that case…

Benefits of Big Data
Security forces held information which they initially thought was irrelevant (7/7) – (analysis – mitigation) STEPFORD & CREVICE
Importance of public contribution and awareness; responsibility and accountability in assisting security forces
Readily available info
Other points

Challenges (these are the points to be put in here)
Proactive monitoring and reactive response are restricted due to the wide area of responsibilities – Mitigation
Data storage is problematic; too much to handle/process – Mitigation
Overall data and a person in charge to connect/link the dots (processor) – Mitigation
Terrorists are also able to access the ‘harnessed data’ (possibilities of ‘snitching’) – Mitigation
Too much information may deviate from the actual task/intention at hand – Mitigation

Sensitivity (these are the points to be put in here)
Ability to share and analyse; inter-state/regional?
National/regional sensitivities prohibiting information sharing
Gathering and sifting through relevant data is complicated; trigger words, patterns, etc.
Security clearance on requested information/data (regional/international)
Outcry of public institutions?

Summary / Conclusion (these are the points to be put in here)
Conclude all relevant points to reach the aim, minus recommendations

Recommendations…

Effectiveness of the Cost Model Data Set. The effectiveness of the cost model data set is as follows. Objectivity: the cost-modeling dataset authenticates the importance of business objectives, and the business objectives are verifiable through independent documentary evidence, for example invoice statements, cheque receipts, and vouchers. Simplicity: the cost model dataset is the simplest among the cost-data analytical models. It provides current reports on asset valuation without reference to the continually changing market, and it calculates annual depreciation without revaluing the constantly changing asset value; examples include cost estimation and cost prediction. Conservative: the cost model dataset does not include any account or asset that has not been verified and authentically proven against the current market sale. Creative accounting can be a significant example of the conservative approach of this model. Consistency: maintaining flexibility is a significant feature of the cost model dataset. It evaluates the financial consistency of a business and makes predictions about the most expected future.
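The point about calculating annual depreciation without revaluing the asset can be sketched with a straight-line calculation under the historical-cost model; the asset figures below are hypothetical, not drawn from any dataset in the text.

```python
# Straight-line depreciation under the cost (historical-cost) model.
# The asset is booked at its original cost and the same charge is expensed
# each year, regardless of market fluctuations in the asset's value.

def straight_line_depreciation(cost, salvage, useful_life_years):
    """Annual depreciation charge: (cost - salvage value) / useful life."""
    return (cost - salvage) / useful_life_years

# Hypothetical asset: bought for 50,000, salvage value 5,000, 9-year life.
annual = straight_line_depreciation(cost=50_000, salvage=5_000, useful_life_years=9)
print(annual)  # 5000.0 expensed every year, with no market revaluation
```

Because the charge depends only on the original cost, salvage value, and useful life, the calculation never needs input from the continually changing market, which is the simplicity the paragraph describes.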
