
Custom Email Notifications for Databricks Pipeline Failures

2025/09/30 06:13

When working with Databricks pipelines and workflows, failures are inevitable. While Databricks provides built-in notifications for job failures, these alerts are often not customizable and may not fit specific reporting or formatting needs. A more flexible and cost-effective approach is to set up custom email notifications that include pipeline details and error messages in a structured format, such as an Excel attachment.

This blog walks through approaches to implementing custom notifications, focusing on SMTP and Azure Logic Apps, with SendGrid and the Microsoft Graph API as alternative delivery options.

Why Custom Notifications?

  • Flexible formatting: Include pipeline metadata, error messages, and runtime details.
  • Attachments: Share structured reports (Excel, CSV, etc.) instead of plain text.
  • Cost efficiency: Avoid additional third-party monitoring solutions.
  • Integration options: Easily plug into existing email infrastructure.

Approach 1: SMTP-Based Notifications

Using SMTP with Python inside a Databricks notebook, you can generate an Excel report and send it via email whenever a pipeline fails.

Example Implementation

import smtplib
from email.mime.multipart import MIMEMultipart
from email.mime.base import MIMEBase
from email.mime.text import MIMEText
from email import encoders
from io import BytesIO
import pandas as pd

# Sample pipeline history
df = spark.createDataFrame([
    ('pipeline1', 'success', '7min'),
    ('pipeline1', 'fail', '3min'),
    ('pipeline1', 'success', '10min')
], ["PipelineName", "Status", "Duration"])

# Convert DataFrame to Excel in memory
output = BytesIO()
with pd.ExcelWriter(output, engine='xlsxwriter') as writer:
    df_pd = df.toPandas()
    df_pd.to_excel(writer, index=False, sheet_name='Sheet1')
    workbook = writer.book
    worksheet = writer.sheets['Sheet1']

    # Apply header formatting
    header_format = workbook.add_format({
        'bold': True,
        'bg_color': '#FFFF00',
        'border': 1,
        'align': 'center',
        'valign': 'vcenter'
    })
    for col_num, value in enumerate(df_pd.columns):
        worksheet.write(0, col_num, value.upper(), header_format)

    # Apply cell borders and column widths
    cell_format = workbook.add_format({'border': 1})
    for row in range(1, len(df_pd) + 1):
        for col in range(len(df_pd.columns)):
            worksheet.write(row, col, df_pd.iloc[row - 1, col], cell_format)

    for i, col in enumerate(df_pd.columns):
        worksheet.set_column(i, i, 20)

output.seek(0)

# Email configuration
sender = "from@example.com"
receiver = "to@example.com"
subject = "Pipeline Execution Report"
body = """Hello Team,

Please find attached the latest pipeline report.

Thanks,
Pipeline Team"""

msg = MIMEMultipart()
msg['From'] = sender
msg['To'] = receiver
msg['Subject'] = subject
msg.attach(MIMEText(body, 'plain'))

# Attach the Excel report
part = MIMEBase('application', 'vnd.openxmlformats-officedocument.spreadsheetml.sheet')
part.set_payload(output.read())
encoders.encode_base64(part)
part.add_header('Content-Disposition', 'attachment; filename="pipeline_report.xlsx"')
msg.attach(part)

# Office 365 SMTP endpoint
smtp_server = "smtp.office365.com"
smtp_port = 587

with smtplib.SMTP(smtp_server, smtp_port) as server:
    server.starttls()
    server.login(sender, "sender_password")
    server.send_message(msg)

print("Email sent successfully with Excel attachment")
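In practice, avoid hardcoding the SMTP password in the notebook. A minimal sketch of the send step using Databricks secrets instead, assuming a secret scope and key (the names pipeline-alerts and smtp-password here are hypothetical) were created beforehand:

# Hypothetical scope/key names; create them first via the Databricks CLI or Secrets API.
smtp_password = dbutils.secrets.get(scope="pipeline-alerts", key="smtp-password")

with smtplib.SMTP(smtp_server, smtp_port) as server:
    server.starttls()
    server.login(sender, smtp_password)
    server.send_message(msg)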

Scheduling Notifications

You can automate the notification trigger by scheduling the notebook:

Option 1: Databricks Jobs

  • Create or edit a Databricks job.
  • Add a task dependency so the notification script runs only if the previous task fails (a sketch of such a job definition follows this list).
  • This ensures error details are captured and reported immediately.
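For illustration, here is a minimal sketch of creating such a job through the Jobs API 2.1 from Python. The workspace URL, token secret, notebook paths, and cluster settings are hypothetical placeholders; the run_if condition is what restricts the notification task to failed runs.

import requests

# Hypothetical workspace URL and token secret -- replace with your own values.
workspace_url = "https://<databricks-instance>"
token = dbutils.secrets.get(scope="pipeline-alerts", key="databricks-token")

job_payload = {
    "name": "pipeline-with-failure-alert",
    "tasks": [
        {
            "task_key": "run_pipeline",
            "notebook_task": {"notebook_path": "/Pipelines/main_pipeline"},
            "job_cluster_key": "shared_cluster",
        },
        {
            "task_key": "send_failure_email",
            "depends_on": [{"task_key": "run_pipeline"}],
            # Run the notification notebook only if the upstream task failed.
            "run_if": "AT_LEAST_ONE_FAILED",
            "notebook_task": {"notebook_path": "/Pipelines/send_failure_email"},
            "job_cluster_key": "shared_cluster",
        },
    ],
    "job_clusters": [
        {
            "job_cluster_key": "shared_cluster",
            "new_cluster": {
                "spark_version": "14.3.x-scala2.12",
                "node_type_id": "Standard_DS3_v2",
                "num_workers": 1,
            },
        }
    ],
}

resp = requests.post(
    f"{workspace_url}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {token}"},
    json=job_payload,
)
resp.raise_for_status()
print("Created job:", resp.json()["job_id"])

The same run_if setting can also be configured from the Jobs UI via the "Run if dependencies" option on the notification task.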

Option 2: Azure Logic Apps

  • Configure a Logic App that listens for pipeline failures.
  • Pass pipeline details and attachments via an API call in JSON format (a sketch of such a call follows this list).
  • Logic Apps handle email delivery and retry mechanisms.
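As an illustrative sketch, a Logic App that starts with the built-in "When a HTTP request is received" trigger exposes a callback URL; the failure handler in Databricks can post the details to it, reusing the in-memory Excel buffer (output) from the SMTP example above. The URL and JSON field names below are hypothetical and must match whatever schema and email action the Logic App is configured with.

import base64
import requests

# Hypothetical callback URL copied from the Logic App's HTTP trigger.
logic_app_url = "https://prod-00.eastus.logic.azure.com/workflows/<workflow-id>/triggers/manual/paths/invoke"

payload = {
    "pipelineName": "pipeline1",
    "status": "fail",
    "errorMessage": "Job run failed: see run page for details",
    "recipients": ["to@example.com"],
    # Excel report produced earlier, base64-encoded so it can travel as JSON.
    "attachmentName": "pipeline_report.xlsx",
    "attachmentContent": base64.b64encode(output.getvalue()).decode("utf-8"),
}

resp = requests.post(logic_app_url, json=payload, timeout=30)
resp.raise_for_status()
print("Logic App accepted the notification:", resp.status_code)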

Conclusion

While Databricks provides basic failure notifications, extending them with custom SMTP or Logic App workflows ensures:

  • Rich, formatted reports.
  • Team visibility with detailed context.
  • Seamless integration with enterprise communication tools.

This approach is cost-effective, scalable, and easily adaptable for large-scale pipeline monitoring.
