
You've migrated a Hadoop job from an on-premises cluster to Dataproc and Cloud Storage. Your Spark job is a complex analytical workload that consists of many shuffle operations, and the initial data are Parquet files (on average 200-400 MB each). You see some performance degradation after the migration to Dataproc, so you'd like to optimize for it. Your organization is very cost-sensitive, so you'd like to continue using Dataproc on preemptible VMs (with only 2 non-preemptible workers) for this workload. What should you do?

A.

Switch from HDDs to SSDs; override the preemptible VMs' configuration to increase the boot disk size

B.

Increase the size of your Parquet files to ensure they are at least 1 GB each

C.

Switch to the TFRecord format (approximately 200 MB per file) instead of Parquet files

D.

Switch from HDDs to SSDs; copy the initial data from Cloud Storage to the Hadoop Distributed File System (HDFS), run the Spark job, and copy the results back to Cloud Storage
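Option A (SSDs with a larger boot disk on the preemptible workers) addresses the shuffle-heavy workload, since Spark spills shuffle data to local disk. A minimal sketch of creating such a cluster with the `gcloud` CLI is shown below; the cluster name, region, machine type, worker counts, and disk sizes are illustrative assumptions, not values from the question.

```shell
# Sketch: Dataproc cluster with 2 non-preemptible workers plus
# preemptible secondary workers, all on SSD boot disks.
# Names, region, machine type, and sizes are assumed values.
gcloud dataproc clusters create shuffle-optimized-cluster \
  --region=us-central1 \
  --master-boot-disk-type=pd-ssd \
  --num-workers=2 \
  --worker-boot-disk-type=pd-ssd \
  --num-secondary-workers=8 \
  --secondary-worker-type=preemptible \
  --secondary-worker-boot-disk-type=pd-ssd \
  --secondary-worker-boot-disk-size=500GB
```

Overriding `--secondary-worker-boot-disk-size` matters because preemptible workers otherwise inherit a default boot disk that can be too small for heavy shuffle spill.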

Google Professional-Data-Engineer Summary

  • Vendor: Google
  • Product: Professional-Data-Engineer
  • Updated: Mar 15, 2026
  • Questions: 400