http://netponto.org 
50th In-Person Meeting @ LISBOA 
DateTime.Parse("22-11-2014", new CultureInfo("pt-PT")); 
hashtag #netponto
50th Meeting, Lisboa – 22-11-2014 http://netponto.org 
Async-Await Best Practices 
in 10 minutes 
Paulo Morgado
Paulo Morgado 
• http://PauloMorgado.NET/ 
• @PauloMorgado 
• http://about.me/PauloMorgado 
• http://www.slideshare.net/PauloJorgeMorgado 
• http://pontonetpt.org/blogs/paulomorgado/ 
• http://blogs.msmvps.com/paulomorgado/ 
• http://weblogs.asp.net/paulomorgado 
• http://www.revista-programar.info/author/pmorgado/
For goodness’ sake, 
stop using async void!
Async void is only for event handlers 
Principles 
Async void is a “fire-and-forget” mechanism... 
The caller is unable to know when an async void has finished 
The caller is unable to catch exceptions thrown from an async void 
(instead they get posted to the UI message-loop) 
Guidance 
Use async void methods only for top-level event handlers (and their like) 
Use async Task-returning methods everywhere else 
If you need fire-and-forget elsewhere, indicate it explicitly e.g. “FredAsync().FireAndForget()” 
When you see an async lambda, verify it
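The guidance above can be sketched as follows. `FireAndForget` is a hypothetical helper (not part of the BCL) that makes deliberate fire-and-forget explicit and observes the task's exception so it isn't silently lost:

```csharp
using System;
using System.Threading.Tasks;

static class TaskExtensions
{
    // Hypothetical helper: logs faults from a deliberately unawaited task.
    public static void FireAndForget(this Task task) =>
        task.ContinueWith(
            t => Console.Error.WriteLine(t.Exception),
            TaskContinuationOptions.OnlyOnFaulted);
}

class Example
{
    // OK: async void only as a top-level event handler.
    async void SaveButton_Click(object sender, EventArgs e) =>
        await SaveAsync();

    // Everywhere else: return Task so callers can await and catch exceptions.
    async Task SaveAsync() => await Task.Delay(100);

    // Fire-and-forget elsewhere: say so explicitly.
    void Elsewhere() => SaveAsync().FireAndForget();
}
```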
The problem is events. 
They’re not going away.
Async over events 
• Principles 
Callback-based programming, as with events, is hard 
• Guidance 
If the event-handlers are largely independent, then leave them as events 
But if they look like a state-machine, then await is sometimes easier 
To turn events into awaitable Tasks, use TaskCompletionSource
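A minimal sketch of the TaskCompletionSource pattern, assuming a Windows Forms `Button` as the event source (any event with a compatible handler works the same way):

```csharp
using System;
using System.Threading.Tasks;
using System.Windows.Forms;

static class EventAwaiting
{
    // Wraps a one-shot event in an awaitable Task.
    public static Task WhenClickedAsync(Button button)
    {
        var tcs = new TaskCompletionSource<object>();
        EventHandler handler = null;
        handler = (s, e) =>
        {
            button.Click -= handler; // unsubscribe so it fires only once
            tcs.SetResult(null);     // complete the task
        };
        button.Click += handler;
        return tcs.Task;
    }
}

// Usage: a state-machine-like flow becomes sequential code.
// await EventAwaiting.WhenClickedAsync(startButton);
// await EventAwaiting.WhenClickedAsync(stopButton);
```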
Is it CPU-bound, 
or I/O-bound?
Thread pool 
Principles 
CPU-bound work means things like LINQ-to-objects, big iterations, or computational inner loops. 
Parallel.ForEach and Task.Run are good ways to put CPU-bound work onto the thread pool. 
The thread pool gradually works out how many threads are needed to make the best progress. 
Adding threads will never increase throughput on a machine that's already under load. 
Guidance 
For IO-bound “work”, use await rather than background threads. 
For CPU-bound work, consider using background threads via Parallel.ForEach or Task.Run, 
unless you're writing a library, or scalable server-side code.
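The two cases side by side, as a sketch (the URL-fetch and square-root loop are illustrative stand-ins for real I/O-bound and CPU-bound work):

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

static class BoundWork
{
    // I/O-bound: just await it; no thread is blocked while the download runs.
    public static async Task<string> FetchAsync(HttpClient client, string url) =>
        await client.GetStringAsync(url);

    // CPU-bound: push it onto the thread pool so the calling thread stays free.
    public static Task<double> ComputeAsync(double[] data) =>
        Task.Run(() =>
        {
            double sum = 0;
            foreach (var x in data) sum += Math.Sqrt(x);
            return sum;
        });
}
```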
Don’t lie
Two ways of thinking about asynchrony 
From the method signature (how people call it) 
• Foo(); 
• Perform something here and now. 
• I’ll regain control to execute something else 
when it’s done. 
• var task = FooAsync(); 
• Initiate something here and now. 
• I’ll regain control to execute something else 
“immediately”. 
From the method implementation (what resources it uses) 
Uses a CPU core solidly while it runs: 
void Foo() 
{ 
    for (int i = 0; i < 100; i++) 
        Math.Sin(i); 
} 
Hardly touches the CPU: 
async Task FooAsync() 
{ 
    await client.DownloadAsync(); 
}
Async methods: Your caller’s assumptions 
“This method’s name ends with ‘Async’, so…” 
“…calling it won’t spawn new threads in my server app” 
“…I can parallelize by simply calling it multiple times” 
Is this true for your async methods?
Libraries generally 
shouldn’t use Task.Run()
Your callers should be the ones to call Task.Run 
“await task;” 
Captures the current SyncContext before awaiting. 
When it resumes, uses SyncContext.Post() to resume “in the same place” 
(If SyncContext is null, uses the TaskScheduler) 
For application-level code: 
This behavior is almost always what you want. 
For library-level code: 
This behavior is rarely what you want!
Sync methods: Your caller’s assumptions 
“There’s a synchronous version of this method…” 
“…I guess it must be faster than the async version” 
“…I can call it from the UI thread if the latency’s fine” 
void Foo() { FooAsync().Wait(); } // will deadlock!!!
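Why the slide's `FooAsync().Wait()` deadlocks on a UI thread, sketched step by step (assuming the `FooAsync` from the earlier slide):

```csharp
// 1. FooAsync starts and hits an await, capturing the UI SynchronizationContext.
// 2. Wait() blocks the UI thread until the returned Task completes.
// 3. When the download finishes, the await's continuation is Post()ed
//    to the UI thread -- which is blocked inside Wait(). Deadlock.
void Foo() => FooAsync().Wait();              // deadlocks on a UI thread

// Safe alternative: stay asynchronous all the way up the call chain.
async Task FooCallerAsync() => await FooAsync();
```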
Library methods shouldn't lie 
Principles 
In a server app, spinning up threads hurts scalability. 
The app (not the library) is in the best position to manage its own threads. 
Users will assume they know your method's implementation by looking at its signature. 
Guidance 
Define an async signature “FooAsync” when your implementation is truly async. 
Define a sync signature "Foo" when your implementation is fast and won't deadlock. 
Don't use blocking calls like Wait() or .Result in libraries; they invite deadlocks.
Use 
ConfigureAwait(false)
SynchronizationContext 
Represents a target for work via its Post method 
WindowsFormsSynchronizationContext 
.Post() does Control.BeginInvoke 
DispatcherSynchronizationContext 
.Post() does Dispatcher.BeginInvoke 
AspNetSynchronizationContext 
.Post() ensures one-at-a-time 
… // ~10 in .NET Framework, and you can write your own 
… // Is the core way for “await” to know how to put you back
SynchronizationContext and Await 
Principles 
In a server app, spinning up threads hurts scalability. 
The app (not the library) is in the best position to manage its own threads. 
Users will assume they know your method's implementation by looking at its signature. 
Guidance 
Define an async signature “FooAsync” when your implementation is truly async. 
Define a sync signature "Foo" when your implementation is fast and won't deadlock. 
Don't use blocking calls like Wait() or .Result in libraries; they invite deadlocks.
SynchronizationContext: ConfigureAwait 
Task.ConfigureAwait(bool continueOnCapturedContext) 
await t.ConfigureAwait(true) // default 
Post continuation back to the current context/scheduler 
await t.ConfigureAwait(false) 
If possible, continue executing where awaited task completes 
Implications 
Performance (avoids unnecessary thread marshaling) 
Deadlock (code shouldn’t block UI thread, but avoids deadlocks if it does)
Use ConfigureAwait(false) 
Principles 
SynchronizationContext is captured before an await, and used to resume from await. 
In a library, this is an unnecessary perf hit. 
It can also lead to deadlocks if the user (incorrectly) calls Wait() on your returned Task. 
Guidance 
In library methods, use "await t.ConfigureAwait(false);"
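A minimal sketch of a library method following this guidance; the file-reading body is illustrative, the point is the `ConfigureAwait(false)` on every await:

```csharp
using System.IO;
using System.Threading.Tasks;

public static class TextFiles
{
    // Library code: ConfigureAwait(false) lets the continuation run on the
    // thread-pool thread that completed the I/O, instead of being Post()ed
    // back to the caller's (possibly blocked) UI context.
    public static async Task<string> ReadAllTextAsync(string path)
    {
        using (var reader = new StreamReader(path))
        {
            return await reader.ReadToEndAsync().ConfigureAwait(false);
        }
    }
}
```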
Await all the way
Await all the way 
Principles 
The compiler provides, through the await keyword, sequential execution of the code. 
Guidance 
Don’t mix async-await with ContinueWith.
Task.Run is the way to 
create new tasks
Task.Run 
Principles 
Task.Run returns hot tasks (running or completed) created with settings suited to async-await. 
Guidance 
Don’t use Task.Factory.StartNew or the Task (or Task<T>) constructor.
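A sketch of why this guidance matters, particularly with async delegates:

```csharp
using System.Threading.Tasks;

class TaskRunVsStartNew
{
    // Task.Run: hot task, thread pool, DenyChildAttach,
    // and it unwraps async delegates to a Task<int>.
    Task<int> good =
        Task.Run(async () => { await Task.Delay(10); return 42; });

    // Task.Factory.StartNew with an async lambda yields Task<Task<int>>:
    // the outer task completes when the delegate hits its first await,
    // not when the work is actually done.
    Task<Task<int>> surprising =
        Task.Factory.StartNew(async () => { await Task.Delay(10); return 42; });

    // The constructor creates a cold task that never runs until Start().
    Task<int> cold = new Task<int>(() => 42);
}
```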
Use the 
CancellationToken
Use the CancellationToken 
Principles 
The CancellationToken structure is the way to signal and handle cancellation. 
Guidance 
If you want your API to be cancellable, use cancellation tokens. 
If your code uses APIs that use cancellation tokens, use them. 
Always check the cancellation tokens.
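A minimal sketch of all three points; `ProcessAsync` and its item loop are illustrative:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

static class Cancellable
{
    // Accept a token, check it yourself, and pass it to the APIs you call.
    public static async Task ProcessAsync(string[] items, CancellationToken ct)
    {
        foreach (var item in items)
        {
            ct.ThrowIfCancellationRequested(); // check in your own loops
            await Task.Delay(100, ct);         // pass to token-aware APIs
        }
    }
}

// Usage: cancel automatically after a timeout.
// var cts = new CancellationTokenSource(TimeSpan.FromSeconds(5));
// await Cancellable.ProcessAsync(items, cts.Token);
```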
Library perf 
considerations
Library perf considerations 
Principles 
Async methods are faster than what you could write manually, but still slower than synchronous. 
The chief cost is in memory allocation (actually, in garbage collection). 
The "fast path" bypasses some allocations. 
Guidance 
Avoid designing "chatty" APIs where async methods are called in an inner loop; make them "chunky". 
If necessary, cache the returned Task object (even with cache size "1"), for zero allocations per call. 
As always, don't prematurely optimize!
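The cache-size-"1" idea can be sketched like this (a hypothetical counter, not from the talk): when the value hasn't changed since the last call, return the previously allocated completed Task instead of a fresh one.

```csharp
using System.Threading.Tasks;

class Counter
{
    private int count;
    private Task<int> cachedTask; // cache of size "1"

    // Fast path: reuse the completed Task<int> when the value is unchanged,
    // so repeated calls allocate nothing.
    public Task<int> GetCountAsync()
    {
        if (cachedTask == null || cachedTask.Result != count)
            cachedTask = Task.FromResult(count);
        return cachedTask;
    }

    public void Increment() => count++;
}
```

Reading `.Result` here is safe because `Task.FromResult` always returns an already-completed task.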
Questions?
Resources 
• Talk: Async best practices 
– http://blogs.msdn.com/b/lucian/archive/2013/11/23/talk-mvp-summit-async-best-practices.aspx 
• Six Essential Tips For Async – Introduction 
– http://channel9.msdn.com/Series/Three-Essential-Tips-for-Async/Three-Essential-Tips-For-Async- 
Introduction 
• Curah! async-await General 
– http://curah.microsoft.com/45553/asyncawait-general 
• Curah! async-await and ASP.NET 
– http://curah.microsoft.com/44400/async-and-aspnet
Questions?
“GOLD” Sponsors 
Twitter: @PTMicrosoft 
http://www.microsoft.com/portugal 
Twitter: @FindMoreC 
http://www.findmore.eu
“Silver” Sponsors
“Bronze” Sponsors
http://bit.ly/netponto-aval-50 
* If you can't fill it in during the meeting, 
we'll send an email with the link in the afternoon
Upcoming in-person meetings 
22/11/2014 – November – 50th Meeting! (Lisboa) 
13/12/2014 – December (Lisboa) 
24/01/2015 – January (Lisboa) 
??/??/2015 – ????? (Porto) 
??/??/2015 – ????? (?????) 
Save these dates in your calendar! :)

Editor's Notes

  1. * Let me put that more strongly * For goodness' sake, stop using async void everywhere. * (At first that was going to be the title of my talk)
  2. * We've seen that async void is a "fire-and-forget" mechanism * Meaning: the caller is *unable* to know when an async void has finished. * And the caller is *unable* to catch exceptions from an async void method. * Guidance is to use async void solely for top-level event handlers. * Everywhere else in code, like SendData, async methods should return Task. * There’s one other danger, about async void and lambdas, I’ll come to it in a moment.
  3. * Basically the problem boils down to events. * Building a complicated UI that responds to events. * The developer had tried one approach, keeping the event handlers local. * That led to a nightmare of nested lambdas. * We tried another approach, figuring out the game's state machine, and how it responds to events. * That ended up being too global. That is, things that should have been local ended up being part of the global state machine. What should have been local variables were promoted to class fields. * The thing is, we have to figure out a way to tame events. * Events have been with us for a long time, and they'll be with us for a long time to come. * WPF, Silverlight, even in Windows8, they're still full of events. * The challenge is how to tame them. * I'm going to show how we can tame them by wrapping them up with tasks.
  4. * So let's review. * It's vital to distinguish between what is CPU-bound work and what is IO-bound work. * CPU-bound means things like LINQ-to-objects, or iterations, or computationally-intensive inner loops. * Parallel.ForEach and Task.Run are good ways to put these CPU-bound workloads on the threadpool. * But it's important to understand that threads haven't increased scalability * Scalability is about not wasting resources * One of those resources is threads * Let's say your server can handle 1000 threads. * If you had 1 thread per request, then you could handle 1000 requests at a time. * But if you created 2 threads per request, then only 500 requests at a time.
  5. * Oh. Just hold on there a moment. * What the heck kind of deserialization takes so long? 100ms per house? That's an eternity. * Well, I checked with the developer. * Turns out his deserialization wasn't really what I'd call deserialization. * It was looking up tables in a database. * That's why it took so long. It was network-bound, not CPU-bound.
  6. * So let's review. * It's vital to distinguish between what is CPU-bound work and what is IO-bound work. * CPU-bound means things like LINQ-to-objects, or iterations, or computationally-intensive inner loops. * Parallel.ForEach and Task.Run are good ways to put these CPU-bound workloads on the threadpool. * But it's important to understand that threads haven't increased scalability * Scalability is about not wasting resources * One of those resources is threads * Let's say your server can handle 1000 threads. * If you had 1 thread per request, then you could handle 1000 requests at a time. * But if you created 2 threads per request, then only 500 requests at a time.
  7. * We've seen that async void is a "fire-and-forget" mechanism * Meaning: the caller is *unable* to know when an async void has finished. * And the caller is *unable* to catch exceptions from an async void method. * Guidance is to use async void solely for top-level event handlers. * Everywhere else in code, like SendData, async methods should return Task. * There’s one other danger, about async void and lambdas, I’ll come to it in a moment.
  8. * There are two ways of thinking about asynchrony. * First is from the method signature, the contract, how people will call it: * A synchronous signature means people expect to call it to perform an operation here and now on this thread, and they won't get back control until it's done. * An asynchronous signature means people expect to call it to kick off an operation, but they'll get back control immediately to do whatever else they want. * Second is from how it's actually implemented: * An implementation that's synchronous is one that burns up a CPU core, doing work. * An asynchronous implementation is one that's lighter; it barely touches the CPU, it just does small bits of work and schedules more. * The thing is, your callers will look at the signature of your method, and they'll make assumptions, right or wrong, about how you're implemented underneath. * It'll be your job to stay in line with those expectations.
  9. * So what are the assumptions people will make? * Imagine someone comes up to your API. * They're going to read the documentation. * Hah! Who am I kidding? They might read the XML doc-comments if we're lucky. * They're going to say "Hey, here's a method, its name ends with Async, so..." * Maybe I'm in a server app. I bet this method's not going to spawn new threads. * I can trust this method to be a good citizen on my server. [CLICK] * I also know that I can parallelize it. * Maybe it's a download API. I can kick off 10 downloads simultaneously, just by invoking it 10 times and then awaiting Task.WhenAll. * And it's not going to hurt my scalability to do so. [CLICK] * These are the assumptions people make when they see "Async" at the end of your library method. * What you have to ask yourselves is this: is it actually true for the async methods you're creating?
  10. * We're going to look at an example where this isn't actually true.   [VS] ThreadPoolScaling.Run() // uncomment Go To Definition * There are going to be a number of small demos in this talk. * This first one is about what happens when a library uses Task.Run internally. * Simple app, a console app, but I'm just spinning up a Winforms dialog here. * This is the minimal code I need to get a UI message-loop (I don't want the rest of the plumbing).   class IO * It's a demo of a library, so we'll have three layers: the app that uses the library, then the library itself, then the underlying framework functionality that the library uses. * In this case, just as a simulation, my OS provides two forms of its DownloadFile API. * One of them's truly synchronous: it really does tie up the thread. It doesn't burn the CPU, but it does block the thread. * The other one's truly asynchronous: not using the CPU, not tying up a thread.   class Library * Here's one way to write the library. It wants to offer up an asynchronous API. * And in this version, it's using the synchronous OS API. Maybe that's the only one available. * So to become async, my API needs to wrap it, with await Task.Run. * It's the top-right quadrant. It looks async, but it's wrapping an implementation that's synchronous. * Probably to avoid blocking the calling thread.   b.Click += async delegate * And here's what the app developer wrote, the user of my library. * They want to be asynchronous, they want to stay responsive. * But say they don't want just one; they want to download 100 files. * They saw that it was an async method, so they trusted they could just kick off all the tasks and then await Task.WhenAll.   [RUN] * Now it has kicked off all 100 of those tasks. * But because each one wants to use a background thread, it's actually going in bursts. * I have four logical cores on this laptop, so the threadpool starts by giving me four threads. As many threads as we have cores.
* Then it looks a second later, says it looks like you've made crappy use of those threads, most of them were idle, waiting on IO * So it looks like you need more threads   [RUN] * See the first batch was 4, then next batch was 5, then 6 * The threadpool has this predefined scaling behavior, hill-climbing * So I've had to wait until the threadpool catches up to me, until it eventually finds its optimal number. * But actually my app didn't need any threads. * As an app author, I didn't even think any threads were involved. * That's the key. You don't want to go messing with things that aren't yours, global resources. * And the threadpool is one of those things. * It belongs to the app developer, not to you the library author. * They might have their own ideas about how they want to use the threadpool.   var contents = await IO.DownloadFileAsync() * Now this one's pure async   [RUN] * And this time all 100 files can download at the same time. * This is what we'd expect. * I shouldn't have to block waiting for the threadpool to grow * I just have the assumption that I'm just kicking off work from the UI thread. * You don't want to be a library author who violates that assumption [CLICK] * If your library's using Task.Run, you're putting in roadblocks that prevent the app from using its threads effectively
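The two library shapes from this demo can be sketched roughly like this (illustrative names and file IO standing in for the demo's DownloadFile APIs, not the actual demo code):

```csharp
using System;
using System.IO;
using System.Threading.Tasks;

public static class Library
{
    // Anti-pattern: "async over sync". The signature promises asynchrony,
    // but each call burns a threadpool thread that just sits blocked on IO.
    // 100 concurrent calls need 100 threads, so callers stall in bursts
    // while the pool's hill-climbing slowly injects more threads.
    public static Task<byte[]> DownloadFileFakeAsync(string path) =>
        Task.Run(() => File.ReadAllBytes(path)); // blocking call on a pool thread

    // Truly async: no thread is consumed while the IO is in flight,
    // so 100 concurrent downloads really do overlap.
    public static async Task<byte[]> DownloadFileAsync(string path)
    {
        using (var stream = new FileStream(path, FileMode.Open, FileAccess.Read,
                                           FileShare.Read, 4096, useAsync: true))
        {
            var buffer = new byte[stream.Length];
            int read = 0;
            while (read < buffer.Length)
                read += await stream.ReadAsync(buffer, read, buffer.Length - read);
            return buffer;
        }
    }
}
```

Both return the same bytes; the difference only shows under concurrency, which is why the fake-async version looks fine in a quick test and then throttles the app at 100 simultaneous calls.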
  11. * So when you do await, it first captures the current SyncContext before awaiting. * When it resumes, it uses SyncContext.Post() to resume "in the same place" as it was before [CLICK] * For app code, this is the behavior that you almost always want. * When you await a download, say, you want to come back and update the UI. [CLICK] * But when you're a library author, it's rarely the behavior that you want. * You usually don't care which threading context you come back on, inside your library methods. * It doesn't matter if your library method finishes off on a different thread either, maybe the IO completion port thread. * That's because when the user awaited on your library method, then their own await is going to put them back where they wanted. They don't need to rely on you.
  12. * Sometimes you think, "well, I've implemented FooAsync(). Shall I also implement Foo()? Not everybody's going to want to await." * Why don't I just give them a helper, with the same implementation as FooAsync, but with a contract that's synchronous? [CLICK] * The user's going to see that there are two versions of your API, one synchronous, one asynchronous. * They'll assume the sync version must be faster than the async, otherwise why else would it be defined? [CLICK] * Also, suppose I'm on the UI thread or in complicated code where for whatever reason I don't want to await. Or suppose I'm in a constructor or one of those other places where I can't await. Maybe it'll be quick enough just to take 5-10ms to call the sync version from the UI thread, because my domain-specific knowledge tells me it'll be okay. [CLICK] * That's what users will think when they see that you have both sync and async. * I wrote this sync version at the bottom. It's the wrong thing to do. It violates that assumption. * Worse than that, it will actually DEADLOCK! * That's because the call to Wait blocks the UI thread, and prevents any awaits from resuming. * So, don't do it! Don't ever use .Wait() or .Result in a library method if you might be called on the UI thread.
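The deadlock pattern described here looks like this in the small (hypothetical `FooAsync`; do not ship this):

```csharp
using System.Threading.Tasks;

public static class SyncOverAsync
{
    public static async Task<int> FooAsync()
    {
        // After this await, the continuation is posted back to the
        // captured SynchronizationContext (the UI thread, if there is one).
        await Task.Delay(100);
        return 42;
    }

    // WRONG: when called on a UI thread, .Result blocks that thread, the
    // posted continuation can never run, so FooAsync never completes
    // and .Result never returns: deadlock.
    public static int Foo() => FooAsync().Result;
}
```

In a console app with no SynchronizationContext the continuation runs on the threadpool and `Foo()` happens to work, which is exactly why this bug slips through unit tests and only deadlocks once the code runs under a UI (or legacy ASP.NET) context.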
  13. * The principle is that the threadpool is a global resource. * You as a library developer have to play nice; help the app author use their domain-specific knowledge as to how to create threads. * Don't do it yourself. * Only show an async signature when your method is truly async. * Don't block in your library calls. * If you block on the UI thread, disaster. * If you block on a threadpool thread, you're hurting the threadpool.
  14. * demo...   [VS] CapturingContext * There's a structure to how I do these perf demos.   const int ITERS = 20000; * Repeat the inner loop 20,000 times.   await t * This one does the default: it captures and resumes on the captured synchronization context.   await t.ConfigureAwait(false) * This one, same code, just resumes on whichever thread it left off. Likely the threadpool thread.   await Task.WhenAll(WithSyncCtx(), WithoutSyncCtx()); // warm-up * Run each method once to ensure it's JIT'd.   [RUN, CTRL+F5] * We see a ten-fold difference. * Each await only costs a few microseconds, which isn't much on its own. * But doing the loop 20,000 times, it adds up to half a second. * And it's completely irrelevant if you're only doing 10 or 100 awaits in your library method. * But if you have an await inside your inner loop, or if your user will call you inside their inner loop, that's when it adds up. [CLICK] * So your habit as library authors should be: always use ConfigureAwait(false). * There are almost no cases where you have to jump back to the thread where you were.
  15. * I need to get technical. Talk about "SynchronizationContext". * It represents a target for work. * It has a core method called "Post". * You invoke Post to get back into a particular context. [CLICK] * For example, in Winforms, if you get the current SynchronizationContext and do Post on it, it does a Control.BeginInvoke. That's how Winforms gets onto the UI thread. [CLICK] * And in WPF/Silverlight/Win8 it's similar, the DispatcherSynchronizationContext. When you do a Post on it, it uses Dispatcher.BeginInvoke. [CLICK] * And the ASP.NET SynchronizationContext, when you do Post() on it, schedules work to be done in its own way. * There are about 10 in the framework, and you can create more. * And this is the core way that the await keyword knows how to put you back where you were.
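The `Post` contract can be shown with a toy context. This is a deliberately minimal sketch: a queue standing in for the Winforms/WPF message loop, not the real framework implementations:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

// A toy context that queues posted work, the way the Winforms context
// queues work to the UI message-loop via Control.BeginInvoke.
public class QueueSyncContext : SynchronizationContext
{
    private readonly BlockingCollection<Action> _queue =
        new BlockingCollection<Action>();

    // The core method: deliver a callback into this context, asynchronously.
    public override void Post(SendOrPostCallback d, object state) =>
        _queue.Add(() => d(state));

    // Drain one pending item, playing the role of the "UI thread".
    public void RunOne() => _queue.Take()();
}
```

After `SynchronizationContext.SetSynchronizationContext(new QueueSyncContext())`, an await in that context captures it and resumes via `Post`, which is the mechanism the next slide builds on.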
  16.
  17. * And so in the framework we provide this helper method Task.ConfigureAwait. * Use it on your await operator. * The default, true, means the await should use SynchronizationContext.Post to resume back where it left off. * If you pass in false, then if possible it'll skip that and just continue where it is, maybe on the IO completion-port thread. * Let's just stay there! It's as good a place as any! [CLICK] * If your library doesn't do this, and you're using await in an inner loop, then you're wasting the user's message-loop: * being a bad citizen, flooding THEIR UI thread with messages that don't have anything to do with them.
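In library code the habit looks like this (a hypothetical helper, sketched to show the pattern; note ConfigureAwait(false) on the await):

```csharp
using System;
using System.IO;
using System.Threading.Tasks;

public static class LibraryHelpers
{
    // Library habit: ConfigureAwait(false) on every await. This method
    // doesn't care which thread it resumes on; the caller's own await
    // will restore THEIR context when this Task completes.
    public static async Task<string> ReadAllTextAsync(string path)
    {
        using (var reader = new StreamReader(path))
        {
            return await reader.ReadToEndAsync().ConfigureAwait(false);
        }
    }
}
```

This also sidesteps the `.Result` deadlock from the earlier slide, since the continuation no longer needs the blocked UI thread, though the real fix is still not to block.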
  18. * So the principle is, you've got to be aware that your library can be called from different environments * SynchronizationContext: think how will your library method run if it's called from the UI context? or from the threadpool context? or ASP.Net? * Guidance: use ConfigureAwait(false): it's good for perf, and also can avoid deadlocks
  19. * We've talked about perf considerations. * Async is as fast as it can be, and the inherent overheads are only noticeable in a tight inner loop. We're talking millions of iterations, not just a few hundred or thousand. * If we can't help it, and our API has to be called frequently, there are some great built-in perf features. * First, there's the "Fast Path". If an await has already completed, then the method just plows right through it. * And if you get to the end of the method without any "slow-path" awaits, then you avoid a bunch of memory allocations. * Guidance is: try to avoid chatty APIs. Make APIs where the consumer of your library doesn't have to await in an inner loop. * You can offer GetNextKilobyteAsync() instead of GetNextBitAsync(). * If you have to, we'll see how to cache the returned Task<T> to remove the one last allocation on the fast path. * Using the domain knowledge that YOU have about the nature of YOUR API can let YOU decide to cache tasks in a way that makes sense.
  20. * Those perf optimizations around await are just there; you'll benefit from them automatically if you use the fast path. * But the one I talked about earlier, about caching the returned Task if it's not one of the common ones, requires some work on your part.   [VS] ReadAsync() * Here I'm going to show you a typical pattern you can use to cache the returned Task, to avoid having to allocate a new Task object every single time.   byte[] data = new byte[0x10000000] * I'm going to allocate a quarter of a gig, and measure how many allocations it takes to copy it.   input.CopyToAsync(Stream.Null).Wait(); * For the copying, I'll be using the .NET framework method Stream.CopyToAsync.   int newGen0 = GC.CollectionCount(0) * And I'll be measuring how many times the GC had to run.   class MemStream1 : MemoryStream * What I'm testing is two different implementations of MemoryStream.   return Read(buffer, offset, count); * My test uses Stream.CopyToAsync, so I know it's going to call into ReadAsync. * This first implementation is just a simple async method. * No awaits, so it always takes the fast path. * But it returns the number of bytes read. * This isn't one of the common values, so it's not a singleton. * Instead it's going to allocate a new Task object every time this is called. * It happens that Stream.CopyToAsync is using buffers of 80k each time, so every Task<int> that it allocates will be a Task<int> with value 81920. * But it's still allocating a new copy of that every single time.   private Task<int> m_cachedTask; * Let's look at this second memory-stream implementation. * This one keeps a cache of the last Task<int> it returned. * When Stream.CopyToAsync invokes ReadAsync, it knows what integer value it needs to return as a Task.   if (m_cachedTask != null && m_cachedTask.Result == numRead) * And if the Task it's cached has the right value, well, it might as well return that. * A single Task object can be used as many times as you like after it's been completed.
* After the Task has completed, it's immutable.   m_cachedTask = Task.FromResult(numRead); * But if the cache wasn't there, or had the wrong value, * then we'll generate an already-completed task with the right value. * That's what Task.FromResult does.   [RUN, CTRL+F5] * And there we see that we've saved an appreciable number of allocations. * I want to stress, it doesn't cache the last INTEGER it returned. * That'd miss the point. Our goal is to reduce the number of Task objects we allocate. * So we have to cache the Task<int>, not just the int. * One thing to ask: how big should our cache be? * Here I've just used a single-element cache. It only stores the previous one. * And what you'll find is that, generally, just a single-element cache works great!   TrackGcs(new MemoryStream(data)); * I just wanted to show you some more perf numbers. * Here I'm going to use the standard built-in MemoryStream.   [RUN, CTRL+F5] * What we see is that MemoryStream actually has some further internal optimizations to eliminate all GCs in this test. * You can go a long way. How worthwhile it is depends on how much time you want to spend as a library author, and how frequently your library APIs will be used.   [CLICK] * Using the domain knowledge that YOU have about the nature of YOUR API can let YOU decide to cache tasks in a way that makes sense. * We used it in the .NET framework to dramatically improve the performance of BufferedStream and MemoryStream. [CLICK] * And we saw that it's important to cache the Task<T>, not the T. * And a cache size of just "1" is often the right choice!
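The caching pattern walked through above can be sketched as a simplified analogue of the demo's second stream (the class name here is mine, not the demo's):

```csharp
using System;
using System.IO;
using System.Threading;
using System.Threading.Tasks;

public class CachingMemoryStream : MemoryStream
{
    // Single-element cache of the last Task<int> we handed out.
    private Task<int> _cachedTask;

    public CachingMemoryStream(byte[] data) : base(data) { }

    public override Task<int> ReadAsync(byte[] buffer, int offset, int count,
                                        CancellationToken cancellationToken)
    {
        // MemoryStream reads complete synchronously, so just do the read...
        int numRead = Read(buffer, offset, count);

        // ...and reuse the cached Task<int> when it carries the right value.
        // A completed Task is immutable, so handing it out repeatedly is safe.
        var cached = _cachedTask;
        if (cached != null && cached.Result == numRead)
            return cached;

        // Cache miss: build an already-completed task with the right value.
        return _cachedTask = Task.FromResult(numRead);
    }
}
```

Because Stream.CopyToAsync asks for the same buffer size on every iteration, this single-element cache hits on almost every call, which is why a cache of size "1" is usually enough.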
  21.
  22. Telerik, Luis Abreu, Pluralsight, Nokia, Redgate
  23. For those who can, please fill it in now, so I won't pester you later :) It's important both for us to receive feedback and for us to give feedback to our speakers. http://goqr.me/