
Microsoft Azure Data Solutions - An Introduction [Paperback]

  • Format: Paperback / softback, 304 pages, height x width x thickness: 232x192x15 mm, weight: 568 g
  • Series: IT Best Practices - Microsoft Press
  • Publication date: 01-Oct-2021
  • Publisher: Addison Wesley
  • ISBN-10: 0137252501
  • ISBN-13: 9780137252503
Direct from Microsoft, this Exam Ref is the official study guide for the new Microsoft DP-200 Implementing an Azure Data Solution certification exam.


Exam Ref DP-200 Implementing an Azure Data Solution offers professional-level preparation that helps candidates maximize their exam performance and sharpen their skills on the job. It focuses on the specific areas of expertise that modern IT professionals need in order to demonstrate real-world mastery of data solution provisioning, data ingestion and transformation, security, data governance, and performance monitoring. Coverage includes:

  • Understanding Azure data solutions: data storage and data processing concepts
  • Implementing Azure data storage solutions: implementing non-relational and relational data stores, and managing data security
  • Managing and developing data processing for Azure data solutions: developing batch processing and streaming solutions
  • Monitoring and optimizing Azure data solutions: monitoring data storage and data processing; troubleshooting data partitioning bottlenecks; optimizing Data Lake Storage, Stream Analytics, and Azure Synapse Analytics; and managing the data lifecycle

Microsoft Exam Ref publications stand apart from third-party study guides because they:

  • Provide guidance from Microsoft, the creator of Microsoft certification exams
  • Target IT professional-level exam candidates with content focused on their needs, not “one-size-fits-all” content
  • Streamline study by organizing material according to the exam’s objective domain (OD), covering one functional group and its objectives in each chapter
  • Offer concise overviews of every skill covered by the exam
  • Feature “Thought Experiments” and “Thought Experiment Answers” to guide candidates through a set of “what if?” scenarios, and prepare them more effectively for Pro-level style exam questions
  • Explore big picture thinking around the planning and design aspects of the IT pro’s job role
  • Deliver exam tips, summaries, and inline questions and answers to help you identify key points
  • Include “Need more review?” reader aids pointing to additional study materials when readers need them

For more information on Exam DP-200 and the Microsoft Certified: Azure Data Engineer credential, visit https://docs.microsoft.com/en-us/learn/certifications/azure-data-engineer



Acknowledgments vi
About the Authors vii
Introduction ix
Chapter 1 Understand Azure data solutions 1
Data-storage concepts 2
Types of data 2
Understand data storage 4
Data storage in Azure 8
Data-processing concepts 18
Batch processing 19
Stream processing 19
Lambda and kappa architectures 20
Azure technologies used for data processing 22
Use cases 26
Advanced analytics 26
Hybrid ETL with existing on-premises SSIS and Azure Data Factory 26
Internet of things architecture 28
Summary 29
Summary exercise 31
Summary exercise answers 31
Chapter 2 Implement data-storage solutions 33
Implement non-relational data stores 33
Implement a solution that uses Cosmos DB, Azure Data Lake Storage Gen2, or Blob storage 34
Implement partitions 46
Implement a consistency model in Cosmos DB 48
Provision a non-relational data store 50
Provision an Azure Synapse Analytics workspace 70
Provide access to data to meet security requirements 74
Implement for high availability, disaster recovery, and global distribution 82
Implement relational data stores 88
Provide access to data to meet security requirements 88
Implement for high availability and disaster recovery 97
Implement data distribution and partitions for Azure Synapse Analytics 99
Implement PolyBase 102
Manage data security 104
Implement dynamic data masking 104
Encrypt data at rest and in motion 107
Summary 110
Summary exercise 111
Summary exercise answers 114
Chapter 3 Manage and develop data processing for Azure Data Solutions 115
Batch data processing 116
Develop batch-processing solutions using Azure Data Factory and Azure Databricks 119
Implement the Integration Runtime for Azure Data Factory 136
Create pipelines, activities, linked services, and datasets 142
Create and schedule triggers 157
Implement Azure Databricks clusters, notebooks, jobs, and autoscaling 162
Ingest data into Azure Databricks 168
Ingest and process data using Azure Synapse Analytics 176
Streaming data 189
Stream-transport and processing engines 191
Implement event processing using Stream Analytics 192
Configure input and output 195
Select the appropriate built-in functions 201
Summary 213
Summary exercise 216
Summary exercise answers 218
Chapter 4 Monitor and optimize data solutions 219
Monitor data storage 220
Monitor an Azure SQL Database 220
Monitor Azure SQL Database using DMV 225
Monitor Blob storage 226
Implement Azure Data Lake Storage monitoring 229
Implement Azure Synapse Analytics monitoring 229
Implement Cosmos DB monitoring 231
Configure Azure Monitor alerts 232
Audit with Azure Log Analytics 236
Monitor data processing 244
Monitor Azure Data Factory pipelines 244
Monitor Azure Databricks 247
Monitor Azure Stream Analytics 249
Monitor Azure Synapse Analytics 250
Configure Azure Monitor alerts 255
Audit with Azure Log Analytics 256
Optimize Azure data solutions 260
Troubleshoot data-partitioning bottlenecks 260
Partitioning considerations 261
Partition Azure SQL Database 263
Partition Azure Blob storage 265
Partition Cosmos DB 266
Optimize Azure Data Lake Storage Gen2 267
Optimize Azure Stream Analytics 268
Optimize Azure Synapse Analytics 269
Manage the data life cycle 272
Summary 276
Summary exercise 278
Summary exercise answers 279
Index 281
Daniel A. Seara is an experienced software developer with more than 20 years of experience as a technical instructor, developer, and development consultant. He has worked as a software consultant for a wide range of companies in Argentina, Spain, and Peru, and was asked by Peruvian Microsoft Consulting Services to help several companies in their migration path to .NET development. Daniel was Argentina's Microsoft regional director for four years and was the first nominated global regional director, a position he held for two years. He also managed the Desarrollador Cinco Estrellas (Five Star Developer) program, one of the most successful training projects in Latin America. Daniel held Visual Basic MVP status for more than 10 years, as well as SharePoint Server MVP status from 2008 until 2014. Additionally, Daniel is the founder and dean of Universidad.NET, the most-visited Spanish-language site for learning .NET. In 2005, he joined Lucient, the leading global company on the Microsoft Data Platform, where he has been working as a trainer, consultant, and mentor.

 

Francesco Milano has been working with Microsoft technologies since 2000. Francesco specializes in the .NET Framework and SQL Server platform, and he focuses primarily on back-end development, integration solutions, and relational model design and implementation. He is a regular speaker at top Italian data platform conferences and workshops. Since 2013, Francesco has also explored emerging trends and technologies pertaining to big data and advanced analytics, consolidating his knowledge of products like HDInsight, Databricks, Azure Data Factory, and Azure Synapse Analytics. In 2015, Francesco joined Lucient, the leading global company on the Microsoft Data Platform, where he has been working as a trainer, consultant, and mentor.

 

Danilo Dominici is an independent consultant, trainer, and speaker with more than 20 years of experience with relational databases and software development in both Windows and *nix environments. He has specialized in SQL Server since 1997, helping customers design, implement, migrate, tune, optimize, and build HA/DR solutions for their SQL Server environments. Danilo has been a Microsoft Certified Trainer since 2000. He works as a trainer for the largest Microsoft learning partners in Italy, teaching SQL Server and Windows Server courses. He is also a regular contributor and speaker at community events in Italy and is the co-leader of the PASS Global Italian Virtual Chapter, the Italian-speaking virtual group. For his commitment, Microsoft has recognized Danilo as a Data Platform MVP since 2014. Danilo was a SQL Server and PostgreSQL DBA and a VMware administrator in Regione Marche, a local government organization in Italy, from 2004 to 2017. Later he became a DBA and a web developer at the Polytechnic University of Marche. Danilo has been part of Lucient since 2011.