
.NET Framework September 2018 Security and Quality Rollup


Today, we are releasing the September 2018 Security and Quality Rollup.

Security

CVE-2018-8421 – Windows Remote Code Execution Vulnerability

This security update resolves a vulnerability in Microsoft .NET Framework that could allow remote code execution when .NET Framework processes untrusted input. An attacker who successfully exploits this vulnerability in software by using .NET Framework could take control of an affected system. The attacker could then install programs; view, change, or delete data; or create new accounts that have full user rights. Users whose accounts are configured to have fewer user rights on the system could be less affected than users who operate with administrative user rights.

CVE-2018-8421

Getting the Update

The Security and Quality Rollup is available via Windows Update, Windows Server Update Services, Microsoft Update Catalog, and Docker.

Microsoft Update Catalog

You can get the update via the Microsoft Update Catalog. For Windows 10, .NET Framework updates are part of the Windows 10 Monthly Rollup.

The following table is for Windows 10 and Windows Server 2016+.

| Product Version | Security and Quality Rollup KB |
| --- | --- |
| Windows 10 1803 (April 2018 Update) | Catalog 4457128 |
| .NET Framework 3.5 | 4457128 |
| .NET Framework 4.7.2 | 4457128 |
| Windows 10 1709 (Fall Creators Update) | Catalog 4457142 |
| .NET Framework 3.5 | 4457142 |
| .NET Framework 4.7.1, 4.7.2 | 4457142 |
| Windows 10 1703 (Creators Update) | Catalog 4457138 |
| .NET Framework 3.5 | 4457138 |
| .NET Framework 4.7, 4.7.1, 4.7.2 | 4457138 |
| Windows 10 1607 (Anniversary Update) / Windows Server 2016 | Catalog 4457131 |
| .NET Framework 3.5 | 4457131 |
| .NET Framework 4.6.2, 4.7, 4.7.1, 4.7.2 | 4457131 |
| Windows 10 1507 | Catalog 4457132 |
| .NET Framework 3.5 | 4457132 |
| .NET Framework 4.6, 4.6.1, 4.6.2 | 4457132 |

The following table is for earlier Windows and Windows Server versions.

| Product Version | Security and Quality Rollup KB | Security Only Update KB |
| --- | --- | --- |
| Windows 8.1 / Windows RT 8.1 / Windows Server 2012 R2 | Catalog 4457920 | Catalog 4457916 |
| .NET Framework 3.5 | 4457045 | 4457056 |
| .NET Framework 4.5.2 | 4457036 | 4457028 |
| .NET Framework 4.6, 4.6.1, 4.6.2, 4.7, 4.7.1, 4.7.2 | 4457034 | 4457026 |
| Windows Server 2012 | Catalog 4457919 | Catalog 4457915 |
| .NET Framework 3.5 | 4457042 | 4457053 |
| .NET Framework 4.5.2 | 4457037 | 4457029 |
| .NET Framework 4.6, 4.6.1, 4.6.2, 4.7, 4.7.1, 4.7.2 | 4457033 | 4457025 |
| Windows 7 / Windows Server 2008 R2 | Catalog 4457918 | Catalog 4457914 |
| .NET Framework 3.5.1 | 4457044 | 4457055 |
| .NET Framework 4.5.2 | 4457038 | 4457030 |
| .NET Framework 4.6, 4.6.1, 4.6.2, 4.7, 4.7.1, 4.7.2 | 4457035 | 4457027 |
| Windows Server 2008 | Catalog 4457921 | Catalog 4457917 |
| .NET Framework 2.0, 3.0 | 4457043 | 4457054 |
| .NET Framework 4.5.2 | 4457038 | 4457030 |
| .NET Framework 4.6 | 4457035 | 4457027 |

Docker Images

We are updating the following .NET Framework Docker images for today’s release:

Note: Look at the “Tags” view in each repository to see the updated Docker image tags.

Previous Monthly Rollups

The last few .NET Framework Monthly updates are listed below for your convenience:


A (Belated) Welcome to C# 7.3


Better late than never! Some of you may have noticed that C# 7.3 already shipped, back in Visual Studio 2017 update 15.7. Some of you may even be using the features already.

C# 7.3 is the newest point release in the 7.0 family and it continues themes of performance-focused safe code, as well as bringing some small "quality of life" improvements in both new and old features.

For performance, we have a few features that improve ref variables, pointers, and stackalloc. ref variables can now be reassigned, letting you treat ref variables more like traditional variables. stackalloc now has an optional initializer syntax, letting you easily and safely initialize stack allocated buffers. For struct fixed-size buffers, you can now index into the buffer without using a pinning statement. And when you do need to pin, we’ve made the fixed statement more flexible by allowing it to operate on any type that has a suitable GetPinnableReference method.
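For a quick feel of the first two, here is a small sketch of our own (not taken from the release notes) showing ref reassignment and the stackalloc initializer syntax:

using System;

class CSharp73Demo
{
    static void Main()
    {
        // stackalloc now supports an initializer; assigning the result to
        // Span<int> keeps the code safe (no unsafe context required).
        Span<int> buffer = stackalloc int[] { 1, 2, 3 };

        int a = 1, b = 2;
        ref int r = ref a;  // r refers to a
        r = ref b;          // new in C# 7.3: ref locals can be ref-reassigned
        r = 42;             // writes through the ref, so b becomes 42

        Console.WriteLine($"{buffer[0]}, a={a}, b={b}");  // prints "1, a=1, b=42"
    }
}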

For feature improvements, we’ve removed some long-standing restrictions on constraints, allowing System.Enum and System.Delegate as constraint types, and we’ve added a new unmanaged constraint that allows you to take a pointer to a generic type parameter. We’ve also improved overload resolution (again!), allowed out and pattern variables in more places, enabled tuples to be compared using == and !=, and fixed the [field: ] attribute target for auto-implemented properties to target the property’s backing field.
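Two of these in a similarly small sketch (again our own illustration; the pointer code must be compiled with /unsafe):

using System;

class QualityOfLifeDemo
{
    // The new unmanaged constraint lets generic code take a pointer to T.
    static unsafe byte FirstByte<T>(T value) where T : unmanaged
    {
        return *(byte*)&value;
    }

    static void Main()
    {
        var left = (x: 1, y: 2);
        var right = (x: 1, y: 2);
        Console.WriteLine(left == right);      // tuples now support == and !=; prints True

        Console.WriteLine(FirstByte(0x1234));  // first byte in memory; 52 (0x34) on little-endian
    }
}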

All of these features are small additions to the language, but they should make each of these parts of the language a little easier and more pleasant. If you want more details, you can see the 15.7 release notes or check out the documentation for What’s new in C# 7.3.

Andy Gocke
C#/VB Compiler Team

Announcing Entity Framework Core 2.2 Preview 2 and the preview of the Cosmos DB provider and spatial extensions for EF Core


Today we are making EF Core 2.2 Preview 2 available, together with a preview of our data provider for Cosmos DB and new spatial extensions for our SQL Server and in-memory providers.

Obtaining the preview

The preview bits are available on NuGet, and also as part of ASP.NET Core 2.2 Preview 2 and the .NET Core SDK 2.2 Preview 2, also releasing today.

If you are working on an application based on ASP.NET Core, we recommend you upgrade to ASP.NET Core 2.2 Preview 2 following the instructions in the announcement.

The SQL Server and the in-memory providers are included in ASP.NET Core, but for other providers and any other type of application, you will need to install the corresponding NuGet package. For example, to add the 2.2 Preview 2 version of the SQL Server provider in a .NET Core library or application from the command line, use:

$ dotnet add package Microsoft.EntityFrameworkCore.SqlServer -v 2.2.0-preview2-35157

Or from the Package Manager Console in Visual Studio:

PM> Install-Package Microsoft.EntityFrameworkCore.SqlServer -Version 2.2.0-preview2-35157

For more details on how to add EF Core to your projects see our documentation on Installing Entity Framework Core.

The Cosmos DB provider and the spatial extensions ship as new separate NuGet packages. We’ll explain how to get started with them in the corresponding feature descriptions.

What is new in this preview?

As we explained in our roadmap announcement back in June, EF Core 2.2 will include a large number of bug fixes (you can see the list of issues we have fixed so far here) but only a relatively small number of new features.

Here are the most salient new features:

New EF Core provider for Cosmos DB

This new provider enables developers familiar with the EF programming model to easily target Azure Cosmos DB as an application database, with all the advantages that come with it, including global distribution, elastic scalability, “always on” availability, very low latency, and automatic indexing.

The provider targets the SQL API in Cosmos DB, and can be installed in an application by issuing the following command from the command line:

$ dotnet add package Microsoft.EntityFrameworkCore.Cosmos.Sql -v 2.2.0-preview2-35157

Or from the Package Manager Console in Visual Studio:

PM> Install-Package Microsoft.EntityFrameworkCore.Cosmos.Sql -Version 2.2.0-preview2-35157

To configure a DbContext to connect to Cosmos DB, you call the UseCosmosSql() extension method. For example, the following DbContext connects to a database called “MyDocuments” on the Cosmos DB local emulator to store a simple blogging model:

public class BloggingContext : DbContext
{
  public DbSet<Blog> Blogs { get; set; }
  public DbSet<Post> Posts { get; set; }

  protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
  {
    optionsBuilder.UseCosmosSql(
      "https://localhost:8081",
      "C2y6yDjf5/R+ob0N8A7Cgv30VRDJIWEHLM+4QDU5DE2nQ9nDuVTqobD4b8mGGyPMbIZnqyMsEcaGQy67XIw/Jw==",
      "MyDocuments");
  }
}

public class Blog
{
  public int BlogId { get; set; }
  public string Name { get; set; }
  public string Url { get; set; }
  public List<Post> Posts { get; set; }
}

public class Post
{
  public int PostId { get; set; }
  public string Title { get; set; }
  public string Content { get; set; }
}

If you want, you can create the database programmatically, using EF Core APIs:

using (var context = new BloggingContext())
{
  await context.Database.EnsureCreatedAsync();
}

Once you have connected to an existing database and you have defined your entities, you can start storing data in the database, for example:

using (var context = new BloggingContext())
{
  context.Blogs.Add(
    new Blog
    {
      BlogId = 1,
      Name = ".NET Blog",
      Url = "https://blogs.msdn.microsoft.com/dotnet/",
      Posts = new List<Post>
      {
        new Post
        {
          PostId = 2,
          Title = "Welcome to this blog!"
        }
      }
    });
  await context.SaveChangesAsync();
}

And you can write queries using LINQ:

var dotNetBlog = context.Blogs.Single(b => b.Name == ".NET Blog");

Current capabilities and limitations of the Cosmos DB provider

Around a year ago, we started showing similar functionality in demos, using a Cosmos DB provider prototype we put together as a proof of concept. This helped us get some great feedback:

  • Most customers we talked to confirmed that they could see a lot of value in being able to use the EF APIs they were already familiar with to target Cosmos DB, and potentially other NoSQL databases.
  • There were specific details about how the prototype worked that we needed to fix.
    For example, our prototype mapped entities in each inheritance hierarchy to their own separate Cosmos DB collections, but because of the way Cosmos DB pricing works, this could become unnecessarily expensive. Based on this feedback, we decided to implement a new mapping convention that by default stores all entity types defined in the DbContext in the same Cosmos DB collection, and uses a discriminator property to identify the entity type.

The preview we are releasing today, although limited in many ways, is no longer a prototype, but the actual code we plan on keeping working on and eventually shipping. Our hope is that by releasing it early in development, we will enable many developers to play with it and provide more valuable feedback.

Here are some of the known limitations we are working on overcoming for Preview 3 and RTM.

  • No asynchronous query support: Currently, LINQ queries can only be executed synchronously.
  • Only some of the LINQ operators translatable to Cosmos DB’s SQL dialect are currently translated.
  • No synchronous API support for SaveChanges(), EnsureCreated() or EnsureDeleted(): you can use the asynchronous versions.
  • No auto-generated unique keys: Since entities of all types share the same collection, each entity needs to have a globally unique key value, but in Preview 2, if you use an integer Id key, you will need to set it explicitly to unique values on each added entity. This has been addressed, and in our nightly builds we now automatically generate GUID values.
  • No nesting of owned entities in documents: We are planning to use entity ownership to decide when an entity should be serialized as part of the same JSON document as its owner. In fact, we are extending the ability to specify ownership to collections in 2.2. However, this behavior hasn’t been implemented yet, and each entity is stored as its own document.

You can track in more detail our progress overcoming these and other limitations in this issue on GitHub.

For anything else that you find, please report it as a new issue.

Spatial extensions for SQL Server and in-memory

Support for exposing the spatial capabilities of databases through the mapping of spatial columns and functions is a long-standing and popular feature request for EF Core. In fact, some of this functionality has been available to you for some time if you use the EF Core provider for PostgreSQL, Npgsql. In EF Core 2.2, we are finally attempting to address this for the providers that we ship.

Our implementation picks the same NetTopologySuite library that the PostgreSQL provider uses as the source of spatial .NET types you can use in your entity properties. NetTopologySuite is a database-agnostic spatial library that implements standard spatial functionality using .NET idioms like properties and indexers.

The extension then adds the ability to map and convert instances of these types to the column types supported by the underlying database, and usage of methods defined on these types in LINQ queries, to SQL functions supported by the underlying database.

You can install the spatial extension using the following command from the command line:

$ dotnet add package Microsoft.EntityFrameworkCore.SqlServer.NetTopologySuite -v 2.2.0-preview2-35157

Or from the Package Manager Console in Visual Studio:

PM> Install-Package Microsoft.EntityFrameworkCore.SqlServer.NetTopologySuite -Version 2.2.0-preview2-35157

Once you have installed this extension, you can enable it in your DbContext by calling the UseNetTopologySuite() method inside UseSqlServer() either in OnConfiguring() or AddDbContext().

For example:

public class SensorContext : DbContext
{
  public DbSet<Measurement> Measurements { get; set; }

  protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
  {
    optionsBuilder
      .UseSqlServer(
        @"Server=(localdb)\mssqllocaldb;Database=SensorDatabase;Trusted_Connection=True;ConnectRetryCount=0",
        sqlOptions => sqlOptions.UseNetTopologySuite());
  }
}
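If you configure the context through dependency injection instead (for example in an ASP.NET Core Startup class), the same provider option can be passed through AddDbContext(). A minimal sketch, assuming SensorContext exposes a constructor accepting DbContextOptions and that the connection string name is illustrative:

// Inside Startup.ConfigureServices
services.AddDbContext<SensorContext>(options =>
    options.UseSqlServer(
        Configuration.GetConnectionString("SensorDatabase"),
        sqlOptions => sqlOptions.UseNetTopologySuite()));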

Then you can start using spatial types in your model definition. In this case, we will use NetTopologySuite.Geometries.Point to represent the location of a measurement:

using NetTopologySuite.Geometries;
...
  public class Measurement
  {
      public int Id { get; set; }
      public DateTime Time { get; set; }
      public Point Location { get; set; }
      public double Temperature { get; set; }
  }

Once you have configured the DbContext and the model in this way, you can create the database, and start persisting spatial data:

using (var context = new SensorContext())
{
  context.Database.EnsureCreated();
  context.AddRange(
    new Measurement { Time = DateTime.Now, Location = new Point(0, 0), Temperature = 0.0},
    new Measurement { Time = DateTime.Now, Location = new Point(1, 1), Temperature = 0.1},
    new Measurement { Time = DateTime.Now, Location = new Point(1, 2), Temperature = 0.2},
    new Measurement { Time = DateTime.Now, Location = new Point(2, 1), Temperature = 0.3},
    new Measurement { Time = DateTime.Now, Location = new Point(2, 2), Temperature = 0.4});
  context.SaveChanges();
}

And once you have a database containing spatial data, you can start executing queries:

var currentLocation = new Point(0, 0);

var nearestMeasurements =
  from m in context.Measurements
  where m.Location.Distance(currentLocation) < 2
  orderby m.Location.Distance(currentLocation) descending
  select m;

foreach (var m in nearestMeasurements)
{
    Console.WriteLine($"A temperature of {m.Temperature} was detected on {m.Time} at {m.Location}.");
}

This will result in the following SQL query being executed:

SELECT [m].[Id], [m].[Location], [m].[Temperature], [m].[Time]
FROM [Measurements] AS [m]
WHERE [m].[Location].STDistance(@__currentLocation_0) < 2.0E0
ORDER BY [m].[Location].STDistance(@__currentLocation_0) DESC

Current capabilities and limitations of the spatial extensions

  • It is possible to map properties of concrete types from NetTopologySuite.Geometries such as Geometry, Point, or Polygon, or interfaces from GeoAPI.Geometries, such as IGeometry, IPoint, IPolygon, etc.
  • Only SQL Server and in-memory database are supported: For in-memory it is not necessary to call UseNetTopologySuite(). SQLite will be enabled in Preview 3 via SpatiaLite.
  • EF Core Migrations does not scaffold spatial types correctly, so you currently cannot use Migrations to create the database schema or apply seed data without workarounds.
  • Mapping to Geography columns doesn’t completely work yet.
  • You may find warnings saying that value converters are used. These can be ignored.
  • Reverse engineering a table that contains spatial columns into an entity isn’t supported yet.

For anything else that you find, please report it as a new issue.

Collections of owned entities

EF Core 2.2 extends the ability to express ownership relationships to one-to-many associations. This helps constrain how entities in an owned collection can be manipulated (for example, they cannot be used without an owner) and triggers automatic behaviors such as implicit eager loading. In the case of relational databases, owned collections are mapped to separate tables from the owner, just like regular one-to-many associations, but in the case of a document-oriented database such as Cosmos DB, we plan to nest owned entities (in owned collections or references) within the same JSON document as the owner.

You can use the feature by invoking the new OwnsMany() API:

modelBuilder.Entity<Customer>().OwnsMany(c => c.Addresses);
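For context, here is a minimal sketch of the kind of model that call could apply to (the Customer and Address types are our own illustration, not from the post):

using System.Collections.Generic;

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
    public ICollection<Address> Addresses { get; set; }  // owned collection
}

// Owned type: it has no DbSet of its own and is only reachable through its owner.
public class Address
{
    public string Street { get; set; }
    public string City { get; set; }
}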

Query tags

This feature is designed to facilitate the correlation of LINQ queries in code with the corresponding generated SQL output captured in logs.

To take advantage of the feature, you annotate a query using the new WithTag() API in a LINQ query. Using the spatial query from the previous example:

var nearestMeasurements =
    from m in context.Measurements.WithTag(@"This is my spatial query!")
    where m.Location.Distance(currentLocation) < 2.5
    orderby m.Location.Distance(currentLocation) descending
    select m;

This will generate the following SQL output:

-- EFCore: (#This is my spatial query!)
SELECT [m].[Id], [m].[Location], [m].[Temperature], [m].[Time]
FROM [Measurements] AS [m]
WHERE [m].[Location].STDistance(@__currentLocation_0) < 2.5E0
ORDER BY [m].[Location].STDistance(@__currentLocation_0) DESC

Provider compatibility

Although we have set up testing to make sure that existing providers will continue to work with EF Core 2.2, there might be unexpected problems, and we welcome users and provider writers to report compatibility issues on our issue tracker.

What comes next?

We are still working on some additional features we would like to include in EF Core 2.2, such as reverse engineering of database views into query types, support for spatial types with SQLite, and additional bug fixes. We are planning on releasing EF Core 2.2 in the last calendar quarter of 2018.

In the meantime, our team has started working on our next major release, EF Core 3.0, which will include, among other improvements, a significant overhaul of our LINQ implementation.

We will also soon start the work to make Entity Framework 6 compatible with .NET Core 3.0, which was announced last May.

Your feedback is really needed!

We encourage you to play with the new features, and we thank you in advance for posting any feedback to our issue tracker.

The spatial extensions and the Cosmos DB provider in particular are very large features that expose a lot of new capabilities and APIs. Whether we can ship these features as part of EF Core 2.2 RTM will depend on your valuable feedback and on our ability to use it to iterate on the design over the next few months.

Announcing ML.NET 0.5


Today, coinciding with .NET Conf 2018, we’re announcing the release of ML.NET 0.5. It’s been a few months already since we released ML.NET 0.1, a cross-platform, open source machine learning framework for .NET developers, at //Build 2018. While we’re evolving through new preview releases, we are getting great feedback and would like to thank the community for your engagement as we continue to develop ML.NET together in the open.

In this 0.5 release we are adding TensorFlow model scoring as a transform to ML.NET. This enables using an existing TensorFlow model within an ML.NET experiment. In addition, we are also addressing a variety of issues and feedback we received from the community. We welcome feedback and contributions to the conversation: relevant issues can be found here.

As part of the upcoming road in ML.NET, we really want your feedback on making ML.NET easier to use. We are working on a new ML.NET API which improves flexibility and ease of use. When the new API is ready and good enough, we plan to deprecate the current LearningPipeline API. Because this will be a significant change we are sharing our proposals for the multiple API options and comparisons at the end of this blog post. We also want an open discussion where you can provide feedback and help shape the long-term API for ML.NET.

This blog post provides details about the following topics in ML.NET:

Added a TensorFlow model scoring transform (TensorFlowTransform)

TensorFlow is a popular deep learning and machine learning toolkit that enables training deep neural networks (and general numeric computations).

Deep learning is a subset of AI and machine learning that teaches programs to do what comes naturally to humans: learn by example.
Its main differentiator compared to traditional machine learning is that a deep learning model can learn to perform object detection and classification tasks directly from images, sound, or text, or even perform tasks such as speech recognition and language translation, whereas traditional ML approaches rely heavily on feature engineering and data processing.
Deep learning models need to be trained with very large sets of labeled data and neural networks that contain multiple layers. Deep learning’s current popularity has two main causes: first, it simply performs better on some tasks like computer vision; second, it can take advantage of the huge amounts of data (and it requires that volume in order to perform well) that are becoming available nowadays.

With ML.NET 0.5 we are starting to add support for deep learning in ML.NET. Today we are introducing the first level of integration with TensorFlow in ML.NET through the new TensorFlowTransform, which enables taking an existing TensorFlow model, either trained by you or downloaded from somewhere else, and getting scores from the TensorFlow model in ML.NET.

This new TensorFlow scoring capability doesn’t require you to have a working knowledge of TensorFlow internal details. Longer term we will be working on making the experience for performing Deep Learning with ML.NET even easier.

The implementation of this transform is based on code from TensorFlowSharp.

As shown in the following diagram, you simply add a reference to the ML.NET NuGet packages in your .NET Core or .NET Framework apps. Under the covers, ML.NET includes and references the native TensorFlow library which allows you to write code that loads an existing trained TensorFlow model file for scoring.

TensorFlow-ML.NET application diagram

The following code snippet shows how to use the TensorFlow transform in the ML.NET pipeline:

// ... Additional transformations in the pipeline code

pipeline.Add(new TensorFlowScorer()
{
    ModelFile = "model/tensorflow_inception_graph.pb",   // Example using the Inception v3 TensorFlow model
    InputColumns = new[] { "input" },                    // Name of input in the TensorFlow model
    OutputColumn = "softmax2_pre_activation"             // Name of output in the TensorFlow model
});

// ... Additional code specifying a learner and training process for the ML.NET model

You can find the complete code example related to the above code snippet, using the TensorFlowTransform, the TensorFlow Inception v3 model, and the existing LearningPipeline API, here.

The code example above uses the pre-trained TensorFlow model named Inception v3, which you can download from here. Inception v3 is a very popular image recognition model trained on the ImageNet dataset, where the TensorFlow model tries to classify entire images into a thousand classes, like “Umbrella”, “Jersey”, and “Dishwasher”.

The Inception v3 model can be classified as a deep convolutional neural network and can achieve reasonable performance on hard visual recognition tasks, matching or exceeding human performance in some domains. The model/algorithm was developed by multiple researchers, based on the original paper “Rethinking the Inception Architecture for Computer Vision” by Szegedy et al.

In the next ML.NET releases, we will add functionality to enable identifying the expected inputs and outputs of TensorFlow models. For now, use the TensorFlow APIs or a tool like Netron to explore the TensorFlow model.

If you open the previous sample TensorFlow model file (tensorflow_inception_graph.pb) with Netron and explore the model’s graph, you can see how it correlates the InputColumn with the node’s input at the beginning of the graph:

TensorFlow model's input in graph

And how the OutputColumn correlates with the softmax2_pre_activation node’s output almost at the end of the graph.

TensorFlow model's output in graph

Limitations: We are currently updating the ML.NET APIs for improved flexibility, as there are a few limitations to using TensorFlow in ML.NET today. For now (when using the LearningPipeline API), these scores can only be used within a LearningPipeline as inputs (numeric vectors) to a learner such as a classifier. However, with the upcoming new ML.NET APIs, the TensorFlow model scores will be directly accessible, so you will be able to score with the TensorFlow model without needing to add an additional learner and its related training process, as is currently done in this sample. That sample creates a multi-class classification ML.NET model based on a StochasticDualCoordinateAscentClassifier, using a label (object name) and a numeric vector feature generated/scored per image file by the TensorFlow model.

Take into account that the TensorFlow code examples mentioned above use the current LearningPipeline API available in v0.5. Moving forward, the ML.NET API for using TensorFlow will be slightly different and not based on the “pipeline”. This is related to the next section of this blog post, which focuses on the new upcoming API for ML.NET.

Finally, we also want to highlight that the ML.NET framework is currently surfacing TensorFlow, but in the future we might look into additional Deep Learning library integrations, such as Torch and CNTK.

You can find an additional code example/test using the TensorFlowTransform with the existing LearningPipeline API here.

Explore the upcoming new ML.NET API (after 0.5) and provide feedback

As mentioned at the beginning of this blog post, we are really looking forward to getting your feedback on the new ML.NET API as we craft it. This evolution in ML.NET offers more flexible capabilities than what the current LearningPipeline API offers. The LearningPipeline API will be deprecated when this new API is ready and good enough.

The following are links to some example feedback we got, in the form of GitHub issues, about the limitations of the LearningPipeline API:

Therefore, based on feedback on the LearningPipeline API, quite a few weeks ago we decided to switch to a new ML.NET API that would address most of the limitations the LearningPipeline API currently has.

Design principles for this new ML.NET API

We are designing this new API based on the following principles:

  • Using terminology parallel to other well-known frameworks like Scikit-Learn, TensorFlow, and Spark; we will try to be consistent in terms of naming and concepts, making it easier for developers to understand and learn ML.NET.

  • Keeping simple ML scenarios, such as a basic train and predict, simple and concise.

  • Allowing advanced ML scenarios (not possible with the current LearningPipeline API as explained in the next section).

We have also explored API approaches like fluent, declarative, and imperative styles.
For a deeper discussion of the principles and required scenarios, check out this issue on GitHub.

Why is ML.NET switching from the LearningPipeline API to a new API?

As part of the preview crafting process (remember that ML.NET is still in early previews), we’ve been getting feedback on the LearningPipeline API and have discovered quite a few limitations we need to address by creating a more flexible API.

Specifically, the new ML.NET API offers attractive features which aren’t possible with the current LearningPipeline API:

  • Strongly-typed API: This new strongly-typed API takes advantage of C# capabilities so errors can be discovered at compile time, along with improved IntelliSense in the editors.

  • Better flexibility: This API provides a decomposable train and predict process, eliminating rigid and linear pipeline execution. With the new API, you can execute a certain code path and then fork the execution so multiple paths can re-use the initial common execution. For example, you can share a given transform’s execution and transformed data with multiple learners and trainers, or decompose pipelines and add multiple learners.

This new API is based on concepts such as Estimators, Transforms, and DataView, as shown in the code later in this blog post.

  • Improved usability: You call the APIs directly from your code; there is no more scaffolding or isolation layer creating an obscure separation between what the user/developer writes and the internal APIs. Entrypoints are no longer mandatory.

  • Ability to simply score with TensorFlow models: Thanks to the flexibility mentioned above, you can also simply load a TensorFlow model and score with it, without needing to add any additional learner and training process, as explained in the earlier “Limitations” topic within the TensorFlow section.

  • Better visibility of the transformed data: You have better visibility of the data while applying transformers.

Comparison of strongly-typed API vs. LearningPipeline API

Another important comparison is related to the Strongly Typed API feature in the new API.
As an example of the issues you can hit without a strongly typed API, the LearningPipeline API (as illustrated in the following code) provides access to data columns by specifying the columns’ names as strings, so if you make a typo (i.e. you write “Descrption” without the ‘i’ instead of “Description”, as in the sample code), you will get a run-time exception:

pipeline.Add(new TextFeaturizer("Description", "Descrption"));       

However, the new ML.NET API is strongly typed, so if you make a typo it will be caught at compile time, and you can also take advantage of IntelliSense in the editor:

var estimator = reader.MakeNewEstimator()
                .Append(row => (
                    description: row.description.FeaturizeText()));

Details on decomposable train and predict API

The following code snippet shows how the transforms and training process of the “GitHub issues labeler” sample app can be implemented with the new API in ML.NET.

This is our current proposal and based on your feedback this API will probably evolve accordingly.

New ML.NET API code example:

public static async Task BuildAndTrainModelToClassifyGithubIssues()
{
    var env = new MLEnvironment();

    string trainDataPath = @"Data\issues_train.tsv";

    // Create reader
    var reader = TextLoader.CreateReader(env, ctx =>
                                    (area: ctx.LoadText(1),
                                    title: ctx.LoadText(2),
                                    description: ctx.LoadText(3)),
                                    new MultiFileSource(trainDataPath), 
                                    hasHeader : true);

    var loss = new HingeLoss(new HingeLoss.Arguments() { Margin = 1 });

    var estimator = reader.MakeNewEstimator()
        .Append(row => (
            // Convert string label to key. 
            label: row.area.ToKey(),
            // Featurize 'description'
            description: row.description.FeaturizeText(),
            // Featurize 'title'
            title: row.title.FeaturizeText()))
        .Append(row => (
            // Concatenate the two features into a vector and normalize.
            features: row.description.ConcatWith(row.title).Normalize(),
            // Preserve the label - otherwise it will be dropped
            label: row.label))
        .Append(row => (
            // Preserve the label (for evaluation)
            row.label,
            // Train the linear predictor (SDCA)
            score: row.label.PredictSdcaClassification(row.features, loss: loss)))
        .Append(row => (
            // Want the prediction, as well as label and score which are needed for evaluation
            predictedLabel: row.score.predictedLabel.ToValue(),
            row.label,
            row.score));

    // Read the data
    var data = reader.Read(new MultiFileSource(trainDataPath));

    // Fit the data to get a model
    var model = estimator.Fit(data);

    // Use the model to get predictions on the test dataset and evaluate the accuracy of the model
    var scores = model.Transform(reader.Read(new MultiFileSource(@"Data\issues_test.tsv")));
    var metrics = MultiClassClassifierEvaluator.Evaluate(scores, r => r.label, r => r.score);

    Console.WriteLine("Micro-accuracy is: " + metrics.AccuracyMicro);

    // Save the ML.NET model into a .ZIP file
    await model.WriteAsync("github-Model.zip");
}

public static async Task PredictLabelForGithubIssueAsync()
{
    // Read model from an ML.NET .ZIP model file; the environment is needed
    // by MakePredictionFunction below
    var env = new MLEnvironment();
    var model = await PredictionModel.ReadAsync("github-Model.zip");

    // Create a prediction function that can be used to score incoming issues
    var predictor = model.AsDynamic.MakePredictionFunction<GitHubIssue, IssuePrediction>(env);

    // This prediction will classify this particular issue in a type such as "EF and Database access"
    var prediction = predictor.Predict(new GitHubIssue
    {
        title = "Sample issue related to Entity Framework",
        description = @"When using Entity Framework Core I'm experiencing database connection failures when running queries or transactions. Looks like it could be related to transient faults in network communication agains the Azure SQL Database."
    });

    Console.WriteLine("Predicted label is: " + prediction.predictedLabel);
}

Compare with the following old LearningPipeline API code snippet that lacks flexibility because the pipeline execution is not decomposable but linear:

Old LearningPipeline API code example:

public static async Task BuildAndTrainModelToClassifyGithubIssuesAsync()
{
    // Create the pipeline
    var pipeline = new LearningPipeline();

    // Read the data
    pipeline.Add(new TextLoader(DataPath).CreateFrom<GitHubIssue>(useHeader: true));

    // Dictionarize the "Area" column
    pipeline.Add(new Dictionarizer(("Area", "Label")));

    // Featurize the "Title" column
    pipeline.Add(new TextFeaturizer("Title", "Title"));

    // Featurize the "Description" column
    pipeline.Add(new TextFeaturizer("Description", "Description"));
    
    // Concatenate the provided columns
    pipeline.Add(new ColumnConcatenator("Features", "Title", "Description"));

    // Set the algorithm/learner to use when training
    pipeline.Add(new StochasticDualCoordinateAscentClassifier());

    // Specify the column to predict when scoring
    pipeline.Add(new PredictedLabelColumnOriginalValueConverter() { PredictedLabelColumn = "PredictedLabel" });

    Console.WriteLine("=============== Training model ===============");

    // Train the model
    var model = pipeline.Train<GitHubIssue, GitHubIssuePrediction>();

    // Save the model to a .zip file
    await model.WriteAsync(ModelPath);

    Console.WriteLine("=============== End training ===============");
    Console.WriteLine("The model is saved to {0}", ModelPath);
}

public static async Task<string> PredictLabelForGitHubIssueAsync()
{
    // Read model from an ML.NET .ZIP model file
    _model = await PredictionModel.ReadAsync<GitHubIssue, GitHubIssuePrediction>(ModelPath);
    
    // This prediction will classify this particular issue in a type such as "EF and Database access"
    var prediction = _model.Predict(new GitHubIssue
        {
            Title = "Sample issue related to Entity Framework", 
            Description = "When using Entity Framework Core I'm experiencing database connection failures when running queries or transactions. Looks like it could be related to transient faults in network communication agains the Azure SQL Database..."
        });

    return prediction.Area;
}

The old LearningPipeline API is a fully linear code path, so you can’t decompose it into multiple pieces.
For instance, the BikeSharing ML.NET sample (available at the machine-learning-samples GitHub repo) is using the current LearningPipeline API.

This sample compares the regression learner accuracy using the evaluators API by:

  • Performing several data transforms to the original dataset
  • Training and creating seven different ML.NET models based on seven different regression trainers/algorithms (such as FastTreeRegressor, FastTreeTweedieRegressor, StochasticDualCoordinateAscentRegressor, etc.)

The intent is to help you compare the regression learners for a given problem.

Since the data transformations are the same for those models, you might want to re-use the code execution related to transforms. However, because the LearningPipeline API only provides a single linear execution, you need to run the same data transformation steps for every model you create/train, as shown in the following code excerpt from the BikeSharing ML.NET sample.

var fastTreeModel = new ModelBuilder(trainingDataLocation, new FastTreeRegressor()).BuildAndTrain();
var fastTreeMetrics = modelEvaluator.Evaluate(fastTreeModel, testDataLocation);
PrintMetrics("Fast Tree", fastTreeMetrics);

var fastForestModel = new ModelBuilder(trainingDataLocation, new FastForestRegressor()).BuildAndTrain();
var fastForestMetrics = modelEvaluator.Evaluate(fastForestModel, testDataLocation);
PrintMetrics("Fast Forest", fastForestMetrics);

var poissonModel = new ModelBuilder(trainingDataLocation, new PoissonRegressor()).BuildAndTrain();
var poissonMetrics = modelEvaluator.Evaluate(poissonModel, testDataLocation);
PrintMetrics("Poisson", poissonMetrics);

//Other learners/algorithms
//...

The BuildAndTrain() method then needs to include both the data transforms and the different algorithm for each case, as shown in the following code:

public PredictionModel<BikeSharingDemandSample, BikeSharingDemandPrediction> BuildAndTrain()
{
    var pipeline = new LearningPipeline();
    pipeline.Add(new TextLoader(_trainingDataLocation).CreateFrom<BikeSharingDemandSample>(useHeader: true, separator: ','));
    pipeline.Add(new ColumnCopier(("Count", "Label")));
    pipeline.Add(new ColumnConcatenator("Features", 
                                        "Season", 
                                        "Year", 
                                        "Month", 
                                        "Hour", 
                                        "Weekday", 
                                        "Weather", 
                                        "Temperature", 
                                        "NormalizedTemperature",
                                        "Humidity",
                                        "Windspeed"));
    pipeline.Add(_algorithm);   // the regression learner passed in for this model

    return pipeline.Train<BikeSharingDemandSample, BikeSharingDemandPrediction>();
}            

With the old LearningPipeline API, for every training run using a different algorithm you need to repeat the same process, performing the following steps again and again:

  • Load dataset from file
  • Make column transformations (concat, copy, or additional featurizers or dictionarizers, if needed)

But with the new ML.NET API based on Estimators and DataView, you will be able to re-use parts of the execution; in this case, re-using the data transforms execution as the base for multiple models using different algorithms.

You can also explore other “aspirational code examples” with the new API here.

Because this will be a significant change in ML.NET we want to share our proposals and start an open discussion with you where you can provide your feedback and help shape the long-term API for ML.NET.

Provide your feedback on the new API


Want to get involved? Start by providing feedback in the comments below or through issues at the ML.NET GitHub repo.

Get started!

If you haven’t already, get started with ML.NET here!

Next, explore some other great resources:

We look forward to your feedback and welcome you to file issues with any suggestions or enhancements in the ML.NET GitHub repo.

This blog was authored by Cesar de la Torre, Gal Oshri, John Alexander, and Ankit Asthana

Thanks,

The ML.NET Team

Announcing .NET Core 2.2 Preview 2


Today, we are announcing .NET Core 2.2 Preview 2. We have great improvements that we want to share and that we would love to get your feedback on, either in the comments or at dotnet/core #1938.

ASP.NET Core 2.2 Preview 2 and Entity Framework 2.2 Preview 2 are also releasing today. We are also announcing C# 7.3 and ML.NET 0.5.

You can see complete details of the release in the .NET Core 2.2 Preview 2 release notes. Related instructions, known issues, and workarounds are included in the release notes. Please report any issues you find in the comments or at dotnet/core #1938.

Thanks to everyone who contributed to .NET Core 2.2. You’ve helped make .NET Core a better product!

Download .NET Core 2.2

You can download and get started with .NET Core 2.2, on Windows, macOS, and Linux:

Docker images are available at microsoft/dotnet for .NET Core and ASP.NET Core.

.NET Core 2.2 Preview 2 can be used with Visual Studio 15.8, Visual Studio for Mac and Visual Studio Code.

Tiered Compilation Enabled

The biggest change in .NET Core 2.2 Preview 2 is that tiered compilation is now enabled by default. We announced that tiered compilation was available as part of the .NET Core 2.1 release. At that time, you had to enable tiered compilation via application configuration or an environment variable. It is now enabled by default and can be disabled as needed.
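If you need to opt out, a sketch of the common option, using the MSBuild project property (setting the environment variable COMPlus_TieredCompilation=0 achieves the same per process):

<!-- In the .csproj project file -->
<PropertyGroup>
  <TieredCompilation>false</TieredCompilation>
</PropertyGroup>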

You can see the benefit of tiered compilation in the image below. The baseline is .NET Core 2.1 RTM, running in a default configuration, with tiered compilation disabled. The second scenario has tiered compilation enabled. You can see a significant requests-per-second (RPS) throughput benefit with tiered compilation enabled.

The numbers in the chart are scaled so that baseline always measures 1.0. That approach makes it very easy to calculate performance changes as a percentage. The first two tests are TechEmpower benchmarks and the last one is Music Store, our frequent sample ASP.NET app.

Platform Support

.NET Core 2.2 is supported on the following operating systems:

  • Windows Client: 7, 8.1, 10 (1607+)
  • Windows Server: 2008 R2 SP1+
  • macOS: 10.12+
  • RHEL: 6+
  • Fedora: 27+
  • Ubuntu: 14.04+
  • Debian: 8+
  • SLES: 12+
  • openSUSE: 42.3+
  • Alpine: 3.7+

Chip support follows:

  • x64 on Windows, macOS, and Linux
  • x86 on Windows
  • ARM32 on Linux (Ubuntu 18.04+, Debian 9+)

Closing

Please download and test .NET Core 2.2 Preview 2. We’re looking for feedback on the release with the intent of shipping the final version later this year.

We recently shared how Bing.com runs on .NET Core 2.1. The Bing.com site experienced significant benefits when it moved to .NET Core 2.1. Please do check out that post if you are interested in a case study of running .NET Core in production. You may also want to take a look at the .NET Customers site if you are interested in a broader set of customer stories.

Announcing .NET Framework 4.8 Early Access build 3646


Today, we are happy to share an Early Access build for the .NET Framework 4.8. This includes an updated .NET 4.8 runtime as well as the .NET 4.8 Developer Pack (a single package that bundles the .NET Framework 4.8 runtime, the .NET 4.8 Targeting Pack, and the .NET Framework 4.8 SDK).

Please help us ensure this is a high quality and compatible release by trying out this build and exploring the new features.

Next steps:
To explore the new features, download the .NET 4.8 Developer Pack build 3646. If you want to try just the .NET 4.8 runtime instead, you can download either of these:

Please provide your feedback by reporting an issue at the .NET Framework Early Access GitHub repository.

Note: this release is still under development; you can expect to see more features and fixes in future preview builds. Also, a reminder that this build is not supported for production use.

This preview build 3646 includes improvements/fixes in the following areas:

  • [Runtime] JIT and NGEN improvements
  • [Windows Forms] Accessibility enhancements
  • [WPF] SelectionTextBrush Property
  • [BCL] Updated ZLib

You can see the complete list of improvements in this build here.

.NET Framework build 3646 is also included in the next update for Windows 10. You can sign up for Windows Insiders to validate that your applications work great on the latest .NET Framework included in the latest Windows 10 releases.


Runtime – JIT improvements

The JIT in .NET 4.8 is based on .NET Core 2.1. All bug fixes and many code generation performance optimizations from .NET Core 2.1 are now available in the .NET Framework.


Runtime – NGEN improvements

NGEN images in the .NET Framework no longer contain writable & executable sections. This reduces the surface area available to attacks that attempt to execute arbitrary code by modifying memory that will be executed.

While there will still be writable & executable data in memory at runtime, this change removes those mapped from NGEN images, allowing them to run in restricted environments that don’t permit executable/writable sections in images. 


Windows Forms – Accessibility Enhancements

In .NET Framework 4.8, WinForms is adding three new features to enable developers to write more accessible applications. The added features are intended to make communication of application data to visually impaired users more robust: we’ve added support for ToolTips when a user navigates via the keyboard, and we’ve added LiveRegions and Notification Events to many commonly used controls.

To enable these features, your application needs to have the following AppContextSwitches enabled in the App.config file:
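The original post showed the configuration as an image; the switches below are our best-effort reconstruction of it, so verify the exact switch names against the official accessibility documentation:

<configuration>
  <runtime>
    <!-- Opt in to the latest WinForms accessibility improvements -->
    <AppContextSwitchOverrides
      value="Switch.UseLegacyAccessibilityFeatures=false;Switch.UseLegacyAccessibilityFeatures.2=false;Switch.UseLegacyAccessibilityFeatures.3=false" />
  </runtime>
</configuration>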


UIA LiveRegions Support in Labels and StatusStrips

UIA Live Regions allow application developers to notify screen readers of a text change on a control that is located apart from where the user is working. An example of where this would come in handy is a StatusStrip that shows a connection status. If the connection is dropped and the status changes, the developer might want to notify the screen reader of the change. Windows Forms has implemented UIA LiveRegions for both the Label control and the StatusStrip control.

Example use of the LiveRegion in a Label Control:
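(The post’s snippet was an image; this reconstruction assumes the LiveSetting property and AutomationLiveSetting enum that ship with this feature.)

// Mark the label as a polite live region, then update its text.
this.label1.LiveSetting = System.Windows.Forms.Automation.AutomationLiveSetting.Polite;
this.label1.Text = "Ready";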

Narrator will now announce “Ready” regardless of where the user is interacting with the application.
You can also implement your UserControl as a live region.


UIA Notification Events

In the Windows 10 Fall Creators Update, Windows introduced a new method for an application to notify Narrator that content has changed and that Narrator should announce the change. The UIA Notification event provides a way for your app to raise a UIA event that leads to Narrator simply making an announcement based on text you supply with the event, without needing a corresponding control in the UI. In some scenarios, this could be a straightforward way to dramatically improve the accessibility of your app. For more information about UIA Notification events, see this blog post.

An example of where a Notification might come in handy is reporting the progress of some process that may take some time.

An example of raising the Notification event:
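(Also reconstructed rather than copied from the post; the control, notification kind, and message are illustrative.)

// Ask Narrator to announce progress without moving focus or adding UI.
this.AccessibilityObject.RaiseAutomationNotification(
    System.Windows.Forms.Automation.AutomationNotificationKind.Other,
    System.Windows.Forms.Automation.AutomationNotificationProcessing.ImportantMostRecent,
    "Scanning 25 percent complete");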


ToolTips on keyboard access

Currently, a control’s tooltip can only be triggered to pop up by moving the mouse pointer into the control. This new feature enables a keyboard user to trigger a control’s tooltip by focusing the control using the Tab key or arrow keys, with or without modifier keys. This particular accessibility enhancement requires an additional AppContextSwitch, as seen in the following example:

  1. Create a new WinForms application.
  2. Add the following XML to the App.config file (see the sketch after this list).
  3. Add several buttons and a ToolTip control to the application’s form.
  4. Set tooltips for the buttons.
  5. Run the application and navigate between the buttons using the keyboard.
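A sketch of the App.config switch referenced in step 2 (the switch name is our assumption based on the feature’s documentation; confirm it against your build):

<configuration>
  <runtime>
    <!-- Enable tooltips on keyboard focus; legacy behavior is mouse-only -->
    <AppContextSwitchOverrides
      value="Switch.System.Windows.Forms.UseLegacyToolTipDisplay=false" />
  </runtime>
</configuration>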

DataGridView control accessible hierarchy changes

Currently, the accessible hierarchy (UI Automation tree) shows the editing box tree element as a child of the currently edited cell, but not as a root child element of the DataGridView. The hierarchy tree update can be observed using the Inspect tool.


WPF – SelectionTextBrush Property for use with Non-Adorner Based Text Selection

In the .NET Framework 4.7.2, WPF added the ability to draw TextBox and PasswordBox text selection without using the adorner layer (see here). The foreground color of the selected text in this scenario was dictated by SystemColors.HighlightTextBrush.

In the .NET Framework 4.8 we are adding a new property, SelectionTextBrush, that allows developers to select the specific brush for the selected text when using non-adorner based text selection.

This property works only on TextBoxBase derived controls and PasswordBox in WPF applications with non-adorner based text selection enabled. It does not work on RichTextBox. If non-adorner based text selection is not enabled, this property is ignored.

To use this property, simply add it to your XAML code and use the appropriate brush or binding.
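For instance, a small XAML sketch (assuming non-adorner based text selection is already enabled via its AppContext switch, as described in the 4.7.2 post):

<TextBox SelectionBrush="Red"
         SelectionOpacity="0.5"
         SelectionTextBrush="White"
         Text="Selected text renders white on red." />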

The resulting text selection will look like this:

You can combine the use of SelectionBrush and SelectionTextBrush to generate any color combination of background and foreground that you deem appropriate.


BCL – Updated ZLib

Starting with .NET Framework 4.5, we have used the native version of ZLib (an external native compression library used for data compression) from http://zlib.net in clrcompression.dll in order to provide an implementation of the deflate algorithm. In .NET Framework 4.8 we updated clrcompression.dll to use ZLib version 1.2.11, which includes several key improvements and fixes.


Closing

Try out these new features in .NET Framework 4.8 early access build 3646 and share your feedback by reporting an issue at the .NET Framework Early Access GitHub repository.

Announcing Cumulative Updates for .NET Framework for Windows 10 October 2018 Update


We deliver .NET Framework updates nearly every month, through Windows Update and other distribution channels. We are making changes to the way that we deliver those updates. We’ll soon start delivering a Cumulative Update for .NET Framework alongside the Windows 10 Cumulative Update, starting with the Windows 10 October 2018 Update. This new approach will give you more flexibility on installing .NET Framework updates.

What is the new Cumulative Update for .NET Framework?

Starting with Windows 10 October 2018 Update and Windows Server 2019, .NET Framework fixes will be delivered through a Cumulative Update for .NET Framework.

We are making this change to enable the following:

  • Provide more flexibility for installing .NET Framework updates (for example, IT Admins can more selectively test line of business applications before broadly deploying).
  • Ability to respond to critical customer needs, when needed, with higher velocity, with standalone .NET Framework patches.

The Cumulative Update for .NET Framework will have the following characteristics:

  • Independent – Released separately from the Windows Cumulative Update
  • Cumulative – The latest patch will fully update all .NET Framework versions on your system
  • Same cadence – The Cumulative Update for .NET Framework will be released on the same cadence as Windows 10.

What should I expect?

You can expect the following new experiences.

Windows Update users:

If you rely on Windows Update to keep your machine up to date and have automatic updates enabled, you will not notice any difference.  Updates for both Windows and the .NET Framework will be silently installed, and as usual you may be prompted for a reboot after installation.

If you manage Windows Update manually, you will notice a Cumulative Update for .NET Framework update alongside the Windows cumulative update. Please continue to apply the latest updates to keep your system up to date.

Image: Cumulative Update for .NET Framework delivered via Windows Update

You can continue to rely on existing guidance for advanced Windows Update settings.

Systems and IT Administrators:

  • System administrators relying on Windows Server Update Services (WSUS) and similar update management applications will observe a new update for .NET Framework when checking for updates applicable to upcoming versions of Windows 10 October 2018 Update and Windows Server 2019.
  • The Cumulative Update for .NET Framework Classifications remain the same as for the Cumulative Update for Windows and continue to show under the same Windows Products. Updates that deliver new Security content will carry the “Security Updates” classification and updates carrying solely new quality updates will carry either the “Updates” or “Critical Updates” classification, depending on their criticality.
  • System administrators that rely on the Microsoft Update Catalog will be able to access the Cumulative Update for .NET Framework by searching for each release’s Knowledge Base (KB) update number. Note that a single update will contain fixes for both the .NET Framework 3.5 and 4.7.2 products.
  • You can use the update title to filter between the Windows Cumulative updates and .NET Framework updates. All other update artifacts are expected to remain the same.

Image: Cumulative Update for .NET Framework delivered via WSUS Administration console

.NET Framework updates across Windows versions

.NET Framework updates will be delivered in the following way:

  • Windows 10 October 2018 Update (version 1809) – One Cumulative Update for .NET Framework, alongside the Windows Cumulative Update.
  • Windows 10 April 2018 Update (version 1803) and earlier versions of Windows 10 – One Windows Cumulative Update (which includes .NET Framework updates), per Windows version.
  • Windows 7 and 8.1 – Multiple .NET Framework updates, per Windows version.

.NET Framework updates are delivered on the same servicing cadence as Windows 10. We deliver different types of updates on different schedules, as described below.

  • The security and quality updates for .NET Framework will be released on Patch Tuesday, the second Tuesday of each month, containing important security and critical quality improvements.
  • Each new security and quality update will supersede and replace the last security and quality update release.
  • Preview updates for .NET Framework will be released one to two weeks after the Patch Tuesday release, for non-security fixes as a limited distribution release (will not be installed automatically).
  • Out-of-band releases are reserved for situations where customer systems must be updated quickly and outside of the regular schedule, to fix security vulnerabilities or to resolve critical quality issues.

For more information about .NET Framework update model for previous versions of Windows, please refer to: Introducing the .NET Framework Monthly Rollup and .NET Framework Monthly Rollups Explained.

Validating the Quality of Updates

We extensively validate the quality of these updates before publishing them. .NET Framework updates are installed by many customers on many machines. It is often the case for Patch Tuesday updates that they contain security updates and it is important that you can apply those quickly throughout your environment. We are continually improving our validation system to ensure high-quality updates.

We use the following approaches to validate quality:

  • Extensive functional testing with in-house regression tests.
  • Compatibility testing with Microsoft applications, servers and services.
  • Compatibility testing with third-party applications that have been submitted to the .NET Framework compatibility lab. You can submit your app at dotnet@microsoft.com.
  • Security Update Validation Program (SUVP).
  • Customer feedback from previous preview releases and Windows Insider builds.

 

FAQ

Will installing a Cumulative Update for .NET Framework upgrade my .NET Framework version?

No. These updates will not upgrade you to a new .NET Framework version. They will update the .NET Framework version you already have installed.

Will I need to reboot after installing the Cumulative Update for .NET Framework?

In most cases, yes.

Will I need an additional reboot when installing the Cumulative Update for .NET Framework together with the Windows Cumulative update?

Windows Update will orchestrate making sure updates that ship at the same time are processed together and only require a single reboot. Guidance to WSUS/IT Admins is to continue to ensure that updates are grouped and deployed together to avoid any potential additional reboots.

Is there a security-only variant of the Cumulative Update for .NET Framework?

No. This approach limits the number of updates to manage and aligns with the model used by Windows 10.

What do I need to do to update .NET Framework 3.5?

Install the Cumulative Update for .NET Framework. It includes .NET Framework 3.5 fixes.

I have concerns about the quality of .NET Framework fixes. What can I do?

You can submit your application for testing in our compatibility lab (send mail to dotnet@microsoft.com) and install .NET Framework Preview of Quality updates to validate compatibility.

Is Microsoft producing new types of patches in addition to the new Cumulative Update for .NET Framework?

No new standalone updates are planned.

.NET Framework September 2018 Preview of Quality Rollup

Late last week we released the September 2018 Preview of Quality Rollup.

Quality and Reliability

This release contains the following quality and reliability improvements.

CLR

  • Updated code to prevent errors regarding invalid date format when Japanese Era 4 is used with a future date [568291]
  • Parsing Japanese dates having a year number exceeding the number of years in that date era will succeed instead of throwing errors [603100]
  • Fixed an IndexOutOfRangeException that is thrown when asynchronously reading a process output and less than a character’s worth of bytes is read at the beginning of a line [621951]
  • Fix in the JIT compiler for a rare case of struct field assignments, described here: https://github.com/Microsoft/dotnet/issues/779 [641182]
  • DateTime.Now and DateTime.UtcNow are now always synchronized with the system time; DateTime and DateTimeOffset operations will continue to work as they did previously [645660]
  • Spin-waits in several synchronization primitives were conditionally improved to perform better on Intel Skylake and more recent microarchitectures. To enable these improvements, set the new configuration variable COMPlus_Thread_NormalizeSpinWait to 1. [647729]
  • Corrected JIT optimization which resulted in removal of interlocked Compare Exchange operation [653568]

WPF

  • Under certain circumstances, WPF applications that use the spell-checker with custom dictionaries can throw unexpected exceptions and crash [622262]

Note: Additional information on these improvements is not available. The VSTS bug number provided with each improvement is a unique ID that you can give Microsoft Customer Support, include in StackOverflow comments, or use in web searches.

Getting the Update

The Preview of Quality Rollup is available via Windows Update, Windows Server Update Services, Microsoft Update Catalog, and Docker.

Microsoft Update Catalog

You can get the update via the Microsoft Update Catalog. For Windows 10, .NET Framework updates are part of the Windows 10 Monthly Rollup.

The following table is for Windows 10 and Windows Server 2016+ versions.

Product Version Preview of Quality Rollup KB
Windows 10 1803 (April 2018 Update) Catalog
4458469
.NET Framework 3.5 4458469
.NET Framework 4.7.2 4458469
Windows 10 1709 (Fall Creators Update) Catalog
4457136
.NET Framework 3.5 4457136
.NET Framework 4.7.1, 4.7.2 4457136
Windows 10 1703 (Creators Update) Catalog
4457141
.NET Framework 3.5 4457141
.NET Framework 4.7, 4.7.1, 4.7.2 4457141
Windows 10 1607 (Anniversary Update)
Windows Server 2016
Catalog
4457127
.NET Framework 3.5 4457127
.NET Framework 4.6.2, 4.7, 4.7.1, 4.7.2 4457127

The following table is for earlier Windows and Windows Server versions.

Product Version Preview of Quality Rollup KB
Windows 8.1
Windows RT 8.1
Windows Server 2012 R2
Catalog
4458613
.NET Framework 3.5 4457009
.NET Framework 4.5.2 4457017
.NET Framework 4.6, 4.6.1, 4.6.2, 4.7, 4.7.1, 4.7.2 4458612
Windows Server 2012 Catalog
4458612
.NET Framework 3.5 4457008
.NET Framework 4.5.2 4457018
.NET Framework 4.6, 4.6.1, 4.6.2, 4.7, 4.7.1, 4.7.2 4457014
Windows 7
Windows Server 2008 R2
Catalog
4458611
.NET Framework 3.5.1 4457008
.NET Framework 4.5.2 4457019
.NET Framework 4.6, 4.6.1, 4.6.2, 4.7, 4.7.1, 4.7.2 4457014
Windows Server 2008 Catalog
4458614
.NET Framework 2.0, 3.0 4457007
.NET Framework 4.5.2 4457019
.NET Framework 4.6 4457014

Previous Monthly Rollups

The last few .NET Framework Monthly updates are listed below for your convenience:


.NET Core October 2018 Update – .NET Core 2.1.5 & SDK 2.1.403

.NET Core October 2018 Update

Today, we are releasing the .NET Core October 2018 Update. This update includes .NET Core 2.1.5 and .NET Core SDK 2.1.403 and contains important reliability fixes.

Getting the Update

The latest .NET Core updates are available on the .NET Core download page. This update will be included in the Visual Studio 15.8.7 update, which will be coming later this month.

See the .NET Core 2.1.5 release notes for details on the release including a detailed commit list.

Docker Images

.NET Docker images have been updated for today’s release. The following repos have been updated.

Note: Look at the “Tags” view in each repository to see the updated Docker image tags.

Note: You must re-pull base images in order to get updates. The Docker client does not pull updates automatically.

Lifecycle Updates

The end of life schedule for .NET Core 2.0 was updated and announced in June and that day has arrived. .NET Core 2.0 was released August 14, 2017 and really began to open the path to the capabilities envisioned for .NET Core. Instructions for upgrading from .NET Core 2.0 to .NET Core 2.1 can be found in the following documents:

.NET Core 2.1 has been declared the long-term support (LTS) release. We recommend that you make .NET Core 2.1 your new standard for .NET Core development.

Azure App Services

Deployment of .NET Core 2.1.5 to Azure App Services has begun and the West Central US region will be live this morning. Remaining regions will be updated over the next few days.

App Services and end of life .NET Core versions

We are working through a maintenance plan for the versions of .NET Core and the .NET Core SDK that will be available on Azure App Services. Currently, the available versions include some which are quite old and need to be removed. The essential idea is to keep the latest patch version of in-support version channels (e.g., 1.0, 2.1, etc.). Because the SDK is capable of building applications for any version of the runtime, only the latest SDK will be retained.

.NET Core 2.0 reached end of life on October 1, 2018 and is no longer eligible for support or updates which means that it should be removed from App Services. However, we understand that many applications on App Services use 2.0 and removing it from the service too quickly would be disruptive. To give ample opportunity to migrate applications, we are going to ‘attach’ the 2.0 App Services maintenance to the .NET Core 1.0 and 1.1 end of life schedule, which concludes June 27, 2019 per the .NET Core Support Policy. After that date, .NET Core 1.0, 1.1 and 2.0 will be removed from Azure App Services.

Previous .NET Core Updates

The last few .NET Core updates follow:

September 2018 Update
August 2018 Update
July 2018 Update
June 2018 Update
May 2018 Update

Update on .NET Core 3.0 and .NET Framework 4.8

In May, we announced .NET Core 3.0, the next major version of .NET Core that adds support for building desktop applications using WinForms, WPF, and Entity Framework 6. We also announced some exciting updates to .NET Framework which enable you to use the new modern controls from UWP in existing WinForms and WPF applications.

Today, Microsoft is sharing a bit more detail on what we’re building and the future of .NET Core and .NET Framework.

.NET Core 3.0 addresses three scenarios our .NET Framework developer community has asked for, including:

  • Side-by-side versions of .NET that support WinForms and WPF: Today there can only be one version of .NET Framework on a machine. This means that when we update .NET Framework on Patch Tuesday or via updates to Windows, there is a risk that a security fix, bug fix, or new API can break applications on the machine. With .NET Core, we solve this problem by allowing multiple versions of .NET Core on the same machine. Applications can be locked to one of the versions and can be moved to use a different version when ready and tested.

  • Embed .NET directly into an application: Today, since there can only be one version of .NET Framework on a machine, if you want to take advantage of the latest framework or language feature you need to install or have IT install a newer version on the machine. With .NET Core, you can ship the framework as part of your application. This enables you to take advantage of the latest version, features, and APIs without having to wait for the framework to be installed.

  • Take advantage of .NET Core features: .NET Core is the fast-moving, open source version of .NET. Its side-by-side nature enables us to quickly introduce new innovative APIs and BCL (Base Class Library) improvements without the risk of breaking compatibility. Now WinForms and WPF applications on Windows can take advantage of the latest .NET Core features, which also include more fundamental fixes for even better high-DPI support.

.NET Framework 4.8 addresses three scenarios our .NET Framework developer community has asked for, including:

  • Modern browser and modern media controls: Today, .NET desktop applications use Internet Explorer and Windows Media Player for showing HTML and playing media files. Since these legacy controls don’t show the latest HTML or play the latest media files, we are adding new controls that take advantage of Microsoft Edge and newer media players to support the latest standards.

  • Access to touch and UWP Controls: UWP (Universal Windows Platform) contains new controls that take advantage of the latest Windows features and touch displays. You won’t have to rewrite your applications to use these new features and controls. We are going to make them available to WinForms and WPF so that you can take advantage of these new features in your existing code.

  • High DPI improvements: The resolution of displays is steadily increasing to 4K and now even 8K resolutions. We want to make sure your existing WinForms and WPF applications can look great on these displays.

Given these updates, we’re hearing a few common questions, such as “What does this mean for the future of .NET Framework?” and “Do I have to move off .NET Framework to remain supported?” While we’ll provide detailed answers below, the key takeaway is that we will continue to move forward and support the .NET Framework, albeit at a slower pace.

How Do We Think of .NET Framework and .NET Core Moving Forward?

.NET Framework is the implementation of .NET that’s installed on over one billion machines and thus needs to remain as compatible as possible. Because of this, it moves at a slower pace than .NET Core. I mentioned above that even security and bug fixes can cause breaks in applications because applications depend on the previous behavior. We will make sure that .NET Framework always supports the latest networking protocols, security standards, and Windows features.

.NET Core is the open source, cross-platform, and fast-moving version of .NET. Because of its side-by-side nature it can take changes that we can’t risk applying back to .NET Framework. This means that .NET Core will get new APIs and language features over time that .NET Framework cannot. At Build I did a demo showing how the file APIs were faster on .NET Core. If we put those same changes into .NET Framework we could break existing applications, and we don’t want to do that.

We will continue to make it easier to move applications to .NET Core. .NET Core 3.0 takes a huge step by adding WPF, WinForms, and Entity Framework 6 support, and we will keep porting APIs and features to help close the gap and make migration easier for those who choose to do so.

If you have existing .NET Framework applications, you should not feel pressured to move to .NET Core. Both .NET Framework and .NET Core will move forward, and both will be fully supported; .NET Framework will always be a part of Windows. But moving forward they will contain somewhat different features. Even inside of Microsoft we have many large product lines that are based on .NET Framework and will remain on .NET Framework.

In conclusion, this is an amazing time to be a .NET developer. We are continuing to advance the .NET Framework with some exciting new features in 4.8 to make your desktop applications more modern. .NET Core is expanding into new areas like desktop, IoT, and machine learning. And we are making it easier and easier to share code between all the .NETs with .NET Standard.

Scott Hunter, Director of Program Management for .NET

Scott Hunter works for Microsoft as a Director of Program Management for .NET. This includes .NET Framework, .NET Core, managed languages, ASP.NET, Entity Framework, and .NET tooling. Before this, Scott was the CTO of several startups, including Mustang Software and Starbase, where he focused on a variety of technologies – but programming the web has always been his real passion.

Announcing ML.NET 0.6 (Machine Learning .NET)

Today we’re announcing our latest monthly release: ML.NET 0.6! ML.NET is a cross-platform, open source machine learning framework for .NET developers. We want to enable every .NET developer to train and use machine learning models in their applications and services. If you haven’t tried ML.NET yet, here’s how you can get started!

The ML.NET 0.6 release delivers several new exciting enhancements:

  • New API for building and using machine learning models

    Our main focus was releasing the first iteration of new ML.NET APIs for building and consuming models. These new, more flexible APIs enable new tasks and code workflows that weren’t possible with the previous LearningPipeline API. We are starting to deprecate the current LearningPipeline API.

    This is a significant change intended to make machine learning easier and more powerful for you. We would love your feedback via an open discussion on GitHub to help shape the long term ML.NET API to maximize your productivity, flexibility and ease of use.

    Learn more about the new ML.NET API

  • Ability to score pre-trained ONNX Models

    Many scenarios like Image Classification, Speech to Text, and translation benefit from using predictions from deep learning models. In ML.NET 0.5 we added support for using TensorFlow models. Now in ML.NET 0.6 we’ve added support for getting predictions from ONNX models.

    Learn more about using ONNX models in ML.NET

  • Significant performance improvements for model prediction, .NET type system consistency, and more

    We know that application performance is critical. In this release, we’ve improved the performance of getting model predictions by 100x or more.

    Additional enhancements include:

    • improvements to ML.NET TensorFlow scoring
    • more consistency with the .NET type-system
    • having a model deployment suitable for serverless workloads like Azure Functions

    Learn more about performance improvements, enhanced TensorFlow support and type-system improvements.

Finally, we’re looking forward to engaging with the open-source community in developing and growing support for machine learning in .NET further. We have already taken steps to integrate with Infer.NET, a project from Microsoft research which has just recently been released as open source project under the .NET Foundation. Infer.NET will extend ML.NET for statistical modelling and online learning and is available in the Microsoft.ML.Probabilistic namespace.

The next sections explain in deeper details the announcements listed above.

New API for building and consuming a Machine Learning model

While the existing LearningPipeline API released with ML.NET 0.1 was easy to get started with, there were some limitations explained in our previous ML.NET blog post. Moving forward the LearningPipeline API has been moved into the Microsoft.ML.Legacy namespace (e.g. Sentiment Analysis based on Binary Classification with the LearningPipeline API).

The new API is designed to support a wider set of scenarios and closely follows ML principles and naming from other popular ML related frameworks like Apache Spark and Scikit-Learn.

Let’s walk through an example to build a sentiment analysis model with the new APIs and introduce the new concepts along the way.

Building an ML Model involves the following high-level steps:

High level steps to build an ML model

To go through these steps with ML.NET there are essentially five main concepts in the new API; let’s take a look at them through this example:

Step 1: Load data

Get started

When building a model with ML.NET you start by creating an ML Context or environment. This is comparable to using DbContext in Entity Framework, but of course, in a completely different domain. The environment provides a context for your ML job that can be used for exception tracking and logging.

var env = new LocalEnvironment();

We are working on bringing this concept/naming closer to EF and other .NET frameworks.

Load your data

One of the most important things is, as always, your data! Load a Dataset into the ML pipeline to be used to train your model.

In ML.NET, data is similar to a SQL view: it is lazily evaluated, schematized, and heterogeneous. In this example, the sample dataset looks like this:

Toxic (label) Comment (text)
1 ==RUDE== Dude, you are rude …
1 == OK! == IM GOING TO VANDALIZE …
0 I also found use of the word “humanists” confusing …
0 Oooooh thank you Mr. DietLime …

To read in this data you will use a data reader, which is an ML.NET component. The reader takes in the environment and requires you to define the schema of your data. In this case, the first column (Toxic) is of type Boolean and is the “label” (the value we want to predict), and the second column (Comment) is the feature of type text/string that we will use to predict the sentiment.

var reader = TextLoader.CreateReader(env, ctx => (label: ctx.LoadBool(0),
                                                  text: ctx.LoadText(1)));

var traindata = reader.Read(new MultiFileSource(TrainDataPath));

Your data schema consists of two columns:

  • a boolean column (Toxic) which is the “label” and positioned as the first column.
  • a text column (Comment) which is the feature we use to predict.

Note that this case, loading your training data from a file, is the easiest way to get started, but ML.NET also allows you to load data from databases or in-memory collections.

Step 2: Extract features (transform your data)

Machine learning algorithms understand featurized data, so the next step is for us to transform our textual data into a format that our ML algorithms recognize. In order to do so we create an estimator and use the FeaturizeText transform as shown in the following snippet:

var est = reader.MakeNewEstimator().Append(row =>
{
    var featurizedText = row.text.FeaturizeText();  //Convert text to numeric vectors
//...
});

An Estimator is an object that learns from data. A transformer is the result of this learning. A good example is training the model with estimator.Fit(), which learns on the training data and produces a machine learning model.

Step 3: Train your model

Add a selected ML Learner (Algorithm)

Now that our text has been featurized, the next step is to add a learner. In this case we will use the SDCAClassifier learner.

Adding a learner also requires us to create an additional context, since we are performing a binary classification ML task for our sentiment analysis.

var bctx = new BinaryClassificationContext(env);

var est = reader.MakeNewEstimator().Append(row =>
{
    var featurizedText = row.text.FeaturizeText();  //Convert text to numeric vectors
    var prediction = bctx.Trainers.Sdca(row.label, featurizedText);  //Specify SDCA trainer
    return (row.label, prediction);  //Return label and prediction columns
});

The learner takes in the label, and the featurized text as input parameters and returns a prediction which contains the predictedLabel, probability and score field triplet.

Train your model

Once the estimator has been defined, you train your model using the Fit() API. This returns a model to use for predictions.

var model = est.Fit(traindata);

Step 4: Evaluate your trained model

Now that you’ve created and trained the model, evaluate it with a different dataset for quality assurance and validation with code similar to the following:

// Evaluate the model with a held-out test dataset
// (TestDataPath is assumed to point at a test file, analogous to TrainDataPath above)
var testdata = reader.Read(new MultiFileSource(TestDataPath));
var predictions = model.Transform(testdata);
var metrics = bctx.Evaluate(predictions, row => row.label, row => row.prediction);
Console.WriteLine("PredictionModel quality metrics evaluation");
Console.WriteLine("------------------------------------------");
Console.WriteLine($"Accuracy: {metrics.Accuracy:P2}");

The code snippet implements the following:

  • Loads the test dataset.
  • Evaluates the model and creates metrics.
  • Shows the accuracy of the model from the metrics.

And now you have a trained model for use in your applications and services.

Step 5: Model Consumption

Now, you can predict with test data by consuming the model you just created and trained.

The following code is a sample you would write in your “production” application when predicting something by scoring with the model:

// Create the prediction function
var predictionFunct = model.AsDynamic.MakePredictionFunction<SentimentIssue, SentimentPrediction>(env);

// Predict the sentiment!
var sampleStatement = new SentimentIssue { text = "This is a very rude movie" };
var resultprediction = predictionFunct.Predict(sampleStatement);

Console.WriteLine($"Text: {sampleStatement.text} | Prediction: {(resultprediction.PredictionLabel ? "Negative" : "Positive")} sentiment");

In that sample, you can guess that the prediction won’t be positive because of the provided text. 😉

You can find all the code of the sentiment analysis example here.

Ability to score pre-trained ONNX Models

ONNX is an open and interoperable model format that enables models trained in one framework (e.g., scikit-learn, TensorFlow, xgboost) to be used in another (like ML.NET).

In ML.NET v0.3, we added the capability of exporting ML.NET models to the ONNX-ML format so additional execution environments could run the model (such as Windows ML).

In this new v0.6 release, ML.NET can also score/predict with trained ONNX models that use the ONNX standard v1.2. We’ve enabled this using a new transformer and runtime for scoring ONNX models, as illustrated below.

Process exporting and scoring ONNX models

There is a large variety of ONNX models created and trained in multiple frameworks that can export models to the ONNX format. Those models can be used for tasks like image classification, emotion recognition, and object detection.

The ONNX transformer in ML.NET enables providing some data to an existing ONNX model (such as the models above) and getting the score (prediction) from it.

The ONNX runtime in ML.NET currently supports only Windows on x64 CPU. Support for other platforms (Linux and macOS) is on the roadmap.

You use an ONNX model in your estimator by simply adding it with a line of code similar to the following:

.Append(row => (row.name, softmaxout_1: row.data_0.ApplyOnnxModel(modelFile)));

Further example usage can be found here.

Improvements to TensorFlow model scoring functionality

In this release, we’ve made it easier to use TensorFlow models in ML.NET. Using the TensorFlow scoring transform requires knowing which node of the model you want to retrieve results from, so we’ve added an API to discover the nodes in the TensorFlow model to help identify the input and output of a TensorFlow model. Example usage can be found here.

Additionally, previously in ML.NET 0.5 we only enabled using ‘frozen’ TensorFlow models. Now in ML.NET 0.6, TensorFlow models in the saved model format can also be used.

Performance improvements

In the ML.NET 0.6 release, we made several performance improvements in making single predictions from a trained model. The first improvement comes from moving from the legacy LearningPipeline API to the new Estimators API. The second improvement comes from optimizing the performance of PredictionFunction in the new API.

To learn about the details of the benchmark results, please see the GitHub issue which covers this in detail.

  • Predictions on Iris data: 3,272x speedup (29x speedup with the Estimators API, with a further 112x speedup with improvements to PredictionFunction).
  • Predictions on Sentiment data: 198x speedup (22.8x speedup with the Estimators API, with a further 8.68x speedup with improvements to PredictionFunction). This model contains a text featurizer, so it is not surprising that we see a smaller gain.
  • Predictions on Breast Cancer data: 6,541x speedup (59.7x speedup with the Estimators API, with a further 109x speedup with improvements to PredictionFunction).

Type system improvements

To make ML.NET easier to use and to take advantage of innovation in .NET, in ML.NET 0.6 we have replaced the Dv type system with .NET’s standard type system.

  • ML.NET previously had its own type system which helped it more efficiently deal with things like missing values (a common case in ML). This type system required users to work with types like DvText, DvBool, DvInt4, etc.
  • One effect of this change is that only floats and doubles have missing values, represented by NaN. More information can be found here.

Additionally, you can now also deploy ML.NET in additional scenarios using .NET app models such as Azure Functions easily without convoluted workarounds, thanks to the improved approach to dependency injection.

Infer.NET is now open-source and becoming part of the ML.NET family

On October 5th 2018, Microsoft Research announced the open-sourcing of Infer.NET – a cross-platform framework for model-based machine learning.

Infer.NET differs from traditional machine learning frameworks in that it requires users to specify a statistical model of their problem. This allows for high interpretability, incorporating domain knowledge, doing unsupervised/semi-supervised learning, as well as online inference – the ability to learn as new data arrives. The approach and many of its applications are described in our free online book for beginners.
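
For a flavor of what specifying a statistical model looks like, here is a minimal sketch using the Microsoft.ML.Probabilistic API mentioned above: inferring the bias of a coin from observed flips (the observed data and variable names are illustrative):

using System;
using Microsoft.ML.Probabilistic.Models;

class CoinBias
{
    static void Main()
    {
        // Model: an unknown bias with a uniform Beta(1,1) prior,
        // where each flip is a Bernoulli draw with that bias.
        Variable<double> bias = Variable.Beta(1, 1);
        Range n = new Range(5);
        VariableArray<bool> flips = Variable.Array<bool>(n);
        flips[n] = Variable.Bernoulli(bias).ForEach(n);

        // Observe data and infer the posterior distribution over the bias.
        flips.ObservedValue = new[] { true, true, false, true, true };
        var engine = new InferenceEngine();
        Console.WriteLine(engine.Infer(bias)); // posterior Beta(5,2)
    }
}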

Places where Infer.NET is used at Microsoft include TrueSkill – a skill rating system for matchmaking in Halo and Gears of War, Matchbox – a recommender system in Azure Machine Learning, and Alexandria – automatic knowledge base construction for Satori, to name a few.

We’re working with the Infer.NET team to make it part of the ML.NET family. Steps already taken in this direction include releasing under the .NET Foundation and changing the package name and namespaces to Microsoft.ML.Probabilistic.

Additional resources

  • The most important ML.NET concepts for understanding the new API are introduced here.

  • A cookbook that shows how to use these APIs for a variety of existing and new scenarios can be found here.

Provide your feedback on the new API

As mentioned at the beginning of the blog post, the new API is a significant change, so we also want to create an open discussion where you can provide feedback and help shape the long-term API for ML.NET.

Want to get involved? Start by providing feedback in the comments on this blog post or through issues at the ML.NET GitHub repo.

Get started!

If you haven’t already, get started with ML.NET here!

Next, explore some other great resources:

We look forward to your feedback and welcome you to file issues with any suggestions or enhancements in the ML.NET GitHub repo.

This blog was authored by Cesar de la Torre, Ankit Asthana, and Chris Lauren, plus additional reviewers on the ML.NET team.

Thanks,

The ML.NET Team

.NET Framework October 2018 Security and Quality Rollup

Today, we released the October 2018 Security and Quality Rollup.

Security

No new security fixes. See the .NET Framework September 2018 Security and Quality Rollup for the latest security update.

Quality and Reliability

This release contains the following quality and reliability improvements.

CLR

  • Updated code to prevent errors regarding invalid date format when Japanese Era 4 is used with a future date [568291]
  • Parsing Japanese dates having a year number exceeding the number of years in that date era will succeed instead of throwing errors [603100]
  • Fixed an IndexOutOfRangeException that is thrown when asynchronously reading a process output and less than a character’s worth of bytes is read at the beginning of a line [621951]
  • Fix in the JIT compiler for a rare case of struct field assignments, described here: https://github.com/Microsoft/dotnet/issues/779 [641182]
  • DateTime.Now and DateTime.UtcNow are now always synchronized with the system time; DateTime and DateTimeOffset operations will continue to work as they did previously [645660]
  • Spin-waits in several synchronization primitives were conditionally improved to perform better on Intel Skylake and more recent microarchitectures. To enable these improvements, set the new configuration variable COMPlus_Thread_NormalizeSpinWait to 1. [647729]
  • Corrected JIT optimization which resulted in removal of interlocked Compare Exchange operation [653568]

WPF

  • Under certain circumstances, WPF applications that use the spell-checker with custom dictionaries can throw unexpected exceptions and crash [622262]

Note: Additional information on these improvements is not available. The VSTS bug number provided with each improvement is a unique ID that you can give Microsoft Customer Support, include in StackOverflow comments, or use in web searches.

Getting the Update

The Security and Quality Rollup is available via Windows Update, Windows Server Update Services, Microsoft Update Catalog, and Docker.

Microsoft Update Catalog

You can get the update via the Microsoft Update Catalog. For Windows 10, .NET Framework updates are part of the Windows 10 Monthly Rollup.

The following table is for Windows 10 and Windows Server 2016+ versions.

Product Version Security and Quality Rollup KB
Windows 10 1803 (April 2018 Update) Catalog
4462919
.NET Framework 3.5 4462919
.NET Framework 4.7.2 4462919
Windows 10 1709 (Fall Creators Update) Catalog
4462918
.NET Framework 3.5 4462918
.NET Framework 4.7.1, 4.7.2 4462918
Windows 10 1703 (Creators Update) Catalog
4462937
.NET Framework 3.5 4462937
.NET Framework 4.7, 4.7.1, 4.7.2 4462937
Windows 10 1607 (Anniversary Update)
Windows Server 2016
Catalog
4462917
.NET Framework 3.5 4462917
.NET Framework 4.6.2, 4.7, 4.7.1, 4.7.2 4462917

The following table is for earlier Windows and Windows Server versions.

Product Version Security and Quality Rollup KB
Windows 8.1
Windows RT 8.1
Windows Server 2012 R2
Catalog
4459924
.NET Framework 3.5 4457009
.NET Framework 4.5.2 4457017
.NET Framework 4.6, 4.6.1, 4.6.2, 4.7, 4.7.1, 4.7.2 4457015
Windows Server 2012 Catalog
4459923
.NET Framework 3.5 4457006
.NET Framework 4.5.2 4457018
.NET Framework 4.6, 4.6.1, 4.6.2, 4.7, 4.7.1, 4.7.2 4457014
Windows 7
Windows Server 2008 R2
Catalog
4459922
.NET Framework 3.5.1 4457008
.NET Framework 4.5.2 4457019
.NET Framework 4.6, 4.6.1, 4.6.2, 4.7, 4.7.1, 4.7.2 4457016
Windows Server 2008 Catalog
4459925
.NET Framework 2.0, 3.0 4457007
.NET Framework 4.5.2 4457019
.NET Framework 4.6 4457016

Previous Monthly Rollups

The last few .NET Framework Monthly updates are listed below for your convenience:

.NET Core October 2018 Update – .NET Core 1.0 and 1.1

Today, we are releasing the .NET Core October 2018 Update for 1.0 and 1.1. This update includes .NET Core 1.0.13, 1.1.10 and .NET Core SDK 1.1.11.

Security

CVE-2018-8292: .NET Core Information Disclosure Vulnerability

Microsoft is aware of an information disclosure vulnerability that exists in .NET Core when HTTP authentication information is inadvertently exposed in an outbound request that encounters an HTTP redirect. An attacker who successfully exploited this vulnerability could use the information to further compromise the web application.

The update addresses the vulnerability by correcting how .NET Core applications handle HTTP redirects.

Getting the Update

The latest .NET Core updates are available on the .NET Core download page.

Today’s releases are listed as follows:

Docker Images

.NET Docker images have been updated for today’s release. The following repos have been updated.

Note: Look at the “Tags” view in each repository to see the updated Docker image tags.

Note: You must re-pull base images in order to get updates. The Docker client does not pull updates automatically.

Azure App Services deployment

Deployment to Azure App Services has begun and the West Central US region will be live this morning. Remaining regions will be updated over the next few days and deployment is expected to be complete by end of week.

Previous .NET Core Updates

The last few .NET Core updates follow:

Using .NET Hardware Intrinsics API to accelerate machine learning scenarios

This week’s blog post is by Brian Lui, one of our summer interns on the .NET team, who’s been hard at work. Over to Brian:

Hello everyone! This summer I interned in the .NET team, working on ML.NET, an open-source machine learning platform which enables .NET developers to build and use machine learning models in their .NET applications. The ML.NET 0.6 release just shipped and you can try it out today.

At the start of my internship, ML.NET code was already relying on vectorization for performance, using a native code library. This was an opportunity to reimplement an existing codebase in managed code, using .NET Hardware Intrinsics for vectorization, and compare results.

What is vectorization, and what are SIMD, SSE, and AVX?

Vectorization is a name used for applying the same operation to multiple elements of an array simultaneously. On the x86/x64 platform, vectorization can be achieved by using Single Instruction Multiple Data (SIMD) CPU instructions to operate on array-like objects.

SSE (Streaming SIMD Extensions) and AVX (Advanced Vector Extensions) are the names for SIMD instruction set extensions to the x86 architecture. SSE has been available for a long time: the CoreCLR underlying .NET Core requires that x86 platforms support at least the SSE2 instruction set. AVX is an extension to SSE that is now broadly available. Its key advantage is that it can handle 8 consecutive 32-bit elements in memory in one instruction, twice as many as SSE can.

.NET Core 3.0 will expose SIMD instructions as APIs that are available to managed code directly, making it unnecessary to use native code to access them.

ARM-based CPUs do offer a similar range of intrinsics, but they are not yet supported on .NET Core (although work is in progress). Therefore, it is necessary to use software fallback code paths for the case when neither AVX nor SSE is available. The JIT makes it possible to do this fallback in a very efficient way. When .NET Core does expose ARM intrinsics, the code could exploit them, at which point the software fallback would rarely, if ever, be needed.

Project goals

  1. Increase ML.NET platform reach (x86, x64, ARM32, ARM64, etc.) by creating a single managed assembly with software fallbacks
  2. Increase ML.NET performance by using AVX instructions where available
  3. Validate .NET Hardware Intrinsics API and demonstrate performance is comparable to native code

I could have achieved the second goal by simply updating the native code to use AVX instructions, but by moving to managed code at the same time I could eliminate the need to build and ship a separate binary for each target architecture – it’s also usually easier to maintain managed code.

I was able to achieve all these goals.

Challenges

It was necessary to first familiarize myself with C# and .NET, and then my work included:

  • using Span<T> in the base-layer implementation of CPU math operations in C#. If you’re unfamiliar with Span<T>, see the great MSDN magazine article C# – All About Span: Exploring a New .NET Mainstay and also the documentation.
  • enabling switching between AVX, SSE, and software implementations depending on availability.
  • correctly handling pointers in the managed code, and removing alignment assumptions made by some of the existing code.
  • using multitargeting to allow ML.NET to continue to function on platforms that don’t have the .NET Hardware Intrinsics APIs.

Multi-targeting

.NET Hardware Intrinsics will ship in .NET Core 3.0, which is currently in development. ML.NET also needs to run on .NET Standard 2.0 compliant platforms – such as .NET Framework 4.7.2 and .NET Core 2.1. To support both, I chose to use multitargeting to create a single .csproj file that targets both .NET Standard 2.0 and .NET Core 3.0.

  1. On .NET Standard 2.0, the system will use the original native implementation with SSE hardware intrinsics
  2. On .NET Core 3.0, the system will use the new managed implementation with AVX hardware intrinsics.

As the code was originally

In the original code, every trainer, learner, and transform used in machine learning ultimately called an SseUtils wrapper method that performs a CPU math operation on input arrays, such as

  • MatMulDense, which takes the matrix multiplication of two dense arrays interpreted as matrices, and
  • SdcaL1UpdateSparse, which performs the update step of the stochastic dual coordinate ascent for sparse arrays.

These wrapper methods assumed a preference for SSE instructions, and called a corresponding method in another class Thunk, which serves as the interface between managed and native code and contains methods that directly invoke their native equivalents. These native methods in .cpp files in turn implemented the CPU math operations with loops containing SSE hardware intrinsics.

Breaking out a managed code-path

To this code I added a new independent code path for CPU math operations that becomes active on .NET Core 3.0, while keeping the original code path running on .NET Standard 2.0. All previous call sites of SseUtils methods now call CpuMathUtils methods of the same name instead, keeping the API signatures of the CPU math operations the same.

CpuMathUtils is a new partial class that contains two definitions for each public API representing CPU math operation, one of which is compiled only on .NET Standard 2.0 while the other, only on .NET Core 3.0. This conditional compilation feature creates two independent code paths for CpuMathUtils methods. Those function definitions compiled on .NET Standard 2.0 call their SseUtils counterparts directly, which essentially follow the original native code path.
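
A sketch of how that conditional compilation can look (the method names here are illustrative assumptions, not the exact ML.NET source):

public static partial class CpuMathUtils
{
    // One public API; which implementation is compiled depends on the target framework.
    public static void Add(float scalar, float[] dst)
    {
#if NETCOREAPP3_0
        // New managed path: hardware intrinsics with a software fallback (sketched below).
        AddIntrinsics(scalar, dst);
#else
        // Original path: wrapper over the native SSE implementation.
        SseUtils.Add(scalar, dst);
#endif
    }
}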

Writing code with software fallback

On the other hand, the other function definitions compiled on .NET Core 3.0 switch to one of three implementations of the same CPU math operation, based on availability at runtime:

  1. an AvxIntrinsics method which implements the operation with loops containing AVX hardware intrinsics,
  2. a SseIntrinsics method which implements the operation with loops containing SSE hardware intrinsics, and
  3. a software fallback in case neither AVX nor SSE is supported.

You will commonly see this pattern whenever code uses .NET Hardware Intrinsics – for example, this is what the code looks like for adding a scalar to a vector:
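
Sketched here with illustrative names (the SSE variant mirrors the AVX implementation shown in the next snippet):

using System.Runtime.Intrinsics.X86;

public static partial class CpuMathUtils
{
    private static void AddIntrinsics(float scalar, float[] dst)
    {
        if (Avx.IsSupported)
        {
            AvxIntrinsics.AddScalarU(scalar, dst);   // 8 floats per instruction
        }
        else if (Sse.IsSupported)
        {
            SseIntrinsics.AddScalarU(scalar, dst);   // 4 floats per instruction
        }
        else
        {
            // Software fallback: a plain scalar loop that works on any platform.
            for (int i = 0; i < dst.Length; i++)
            {
                dst[i] += scalar;
            }
        }
    }
}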

If AVX is supported, it is preferred, otherwise SSE is used if available, otherwise the software fallback path. At runtime, the JIT will actually generate code for only one of these three blocks, as appropriate for the platform it finds itself on.

To give you an idea, here is what the AVX implementation called by the method above looks like:
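
In sketch form, using the .NET Core 3.0 intrinsics API (details may differ from the actual dotnet/machinelearning code):

using System.Runtime.Intrinsics;
using System.Runtime.Intrinsics.X86;

internal static class AvxIntrinsics
{
    // Adds a scalar to each element: 8 floats per AVX instruction,
    // then 4 per SSE instruction, then a scalar loop for the remainder.
    public static unsafe void AddScalarU(float scalar, float[] dst)
    {
        fixed (float* pDst = dst)
        {
            float* pCurrent = pDst;
            float* pEnd = pDst + dst.Length;

            Vector256<float> scalar256 = Vector256.Create(scalar);
            while (pCurrent + 8 <= pEnd)
            {
                Avx.Store(pCurrent, Avx.Add(scalar256, Avx.LoadVector256(pCurrent)));
                pCurrent += 8;
            }

            Vector128<float> scalar128 = Vector128.Create(scalar);
            while (pCurrent + 4 <= pEnd)
            {
                Sse.Store(pCurrent, Sse.Add(scalar128, Sse.LoadVector128(pCurrent)));
                pCurrent += 4;
            }

            while (pCurrent < pEnd)
            {
                *pCurrent += scalar;
                pCurrent++;
            }
        }
    }
}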

You will notice that it operates on floats in groups of 8 using AVX, then any group of 4 using SSE, and finally a software loop for any that remain. (There are potentially more efficient ways to do this, which I won’t discuss here – there will be future blog posts dedicated to .NET Hardware Intrinsics.)

You can see all my code on the dotnet/machinelearning repository.

Since the AvxIntrinsics and SseIntrinsics methods in managed code directly implement the CPU math operations analogous to the native methods originally in .cpp files, the code change not only removes native dependencies but also simplifies the levels of abstraction between public APIs and base-layer hardware intrinsics.

After making this replacement I was able to use ML.NET to perform tasks such as train models with stochastic dual coordinate ascent, conduct hyperparameter tuning, and perform cross validation, on a Raspberry Pi, when previously ML.NET required an x86 CPU.

Here’s what the architecture looks like now (Figure 1):

Performance improvements

So what difference did this make to performance?

I wrote tests using BenchmarkDotNet to gather measurements.

First, I disabled the AVX code paths in order to fairly compare the native and managed implementations while both were using the same SSE instructions. As Figure 2 shows, the performance is closely comparable: on the large vectors the tests operate on, the overhead added by managed code is not significant.

Figure 2

Second, I enabled AVX support. Figure 3 shows that the average performance gain in microbenchmarks was about 20% over SSE alone.

Figure 3

Taking both together — the upgrade from the SSE implementation in native code to the AVX implementation in managed code — I measured an 18% improvement in the microbenchmarks. Some operations were up to 42% faster, while some others involving sparse inputs have potential for further optimization.

What ultimately matters of course is the performance for real scenarios. On .NET Core 3.0, training models of K-means clustering and logistic regression got faster by about 14% (Figure 4).

Figure 4

In closing

My summer internship experience with the .NET team has been rewarding and inspiring for me. My manager Dan and my mentors Santi and Eric gave me an opportunity to go hands-on with a real shipping project. I was able to work with other teams and external industry partners to optimize my code, and most importantly, as a software engineering intern with the .NET team, I was exposed to almost every step of the entire working cycle of a product enhancement, from idea generation to code review to product release with documentation.

I hope this has demonstrated how powerful .NET Hardware Intrinsics can be and I encourage you to consider opportunities to use them in your own projects when previews of .NET Core 3.0 become available.

Guidance for library authors

We’ve just published our first cut of the .NET Library Guidance. It’s a brand new set of articles for .NET developers who want to create high-quality libraries for .NET. The guidance contains recommendations we’ve identified as common best practices that apply to most public .NET libraries.

We want to help .NET developers build great libraries with these aspects:

  • Inclusive – Good .NET libraries strive to support many platforms and applications.
  • Stable – Good .NET libraries coexist in the .NET ecosystem, running in applications built with many libraries.
  • Designed to evolve – .NET libraries should improve and evolve over time, while supporting existing users.
  • Debuggable – A high-quality .NET library should use the latest tools to create a great debugging experience for users.
  • Trusted – .NET libraries earn developers’ trust by publishing to NuGet using security best practices.

As well as a source of information, we hope the guidance can be a topic of discussion between Microsoft and the .NET open-source community: when creating a .NET library, what feels good and what are the points of friction. In recent years, Microsoft has made large investments in .NET tooling to make it easier to build .NET libraries, including cross-platform targeting, .NET Standard, and close integration with NuGet. We want your feedback to help improve .NET, and the .NET open-source ecosystem into the future.

Finally, the guidance isn’t complete. We want input from authors of .NET libraries to help improve and expand the documentation as .NET continues to grow and improve.

Video

.NET Conf 2018 featured a video that demonstrates many of the same guidelines:

Closing

Please check out the .NET Library Guidance and use the Feedback section to leave any comments. We hope to see your fantastic .NET libraries on NuGet soon!


Automating Release Notes with Azure Functions

We can all agree that tracking a project’s progress enhances productivity and is an effective way to keep everyone involved informed. When it comes to managing your project in Azure DevOps (formerly VSTS) or GitHub, you have all of your artifacts in one place: code, CI/CD pipelines, releases, work items, and more. In cases where there’s a larger project with a larger team, the rate at which pull requests and work items are created, opened, and closed will increase significantly between each release. Imagine a large user base that wants to stay updated on these changes through release notes. They’ll want to know if that pesky bug that was introduced in the last version got fixed this time, or if that feature they’re excited about finally made it out of beta.

Release notes tend to map directly to items that a team is tracking internally; I’d expect a work item on a high-severity bug to make it into the release documentation. However, putting together release notes can be quite a challenge and very time consuming. When it’s time to ship new software updates, someone must manually go back in time to the last release, gather the relevant information, and compile it into a document to share with users. How do we keep the release information current and accurate for end users?

It’d be nice to automate the process of extracting information from completed work items and merged pull requests to create a document that outlines changes in a new release. This was the inspiration for the Release Notes Generator. With Azure Functions and Azure Blob Storage, the generator creates a markdown file every time a new release is created in Azure DevOps. In this post, we’ll walk through how the generator works, and use a sample DevOps project as an example for the generator. If you’d like a GitHub version, see the GitHub release notes generator sister post.

 

Overview of an Azure DevOps Project’s Work Items

 

View of Rendered Markdown version of release notes in VS Code with Markdown All in One Extension

 

The generator is an Azure Function app; Functions allow you to pay only for the time your code is running, so I’m only paying for the time it takes my note-generating code to execute. With an HTTP triggered function, a webhook is configured in Azure DevOps to send an HTTP request to the function to kick off the notes generation process. Webhook configuration simply requires you to copy in the URL of the function that you’d like to send the request to. An added benefit of using Azure Functions is that you can get started locally on your machine using the Azure Functions Core Tools or Visual Studio. You can create, debug, test, and deploy your function app all from the comfort of your own computer without even needing an Azure subscription. You can test HTTP triggered functions locally with a tool like ngrok.
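
A condensed sketch of such an HTTP triggered function (the function name and body here are illustrative, not the actual sample code):

using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class ReleaseNotesFunction
{
    [FunctionName("GenerateReleaseNotes")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
        ILogger log)
    {
        // The Azure DevOps webhook posts a JSON payload describing the new release.
        string payload = await new StreamReader(req.Body).ReadToEndAsync();
        log.LogInformation("Release event received; generating notes.");

        // ... query work items and pull requests, build the markdown document ...

        return new OkResult();
    }
}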

 

Local development of HTTP triggered function in Visual Studio

Out of all of Azure’s storage account offerings, blob storage is suited for serving and storing unstructured data objects like textual files, including the markdown representation of the release notes. The blob storage structure is similar to common file systems where objects, named blobs, are organized and stored in containers. This way, the release notes have a dedicated location inside a “releases” container. You can manage a storage account on your computer with the Azure Storage Explorer.

 

Release notes blobs in the releases container in Azure Storage Explorer

The release function uses the Azure Storage API to create a file and append text and links to it. Interacting with blobs and blob containers through the API requires minimal setup; you just need the associated storage account connection string to get started.
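
A condensed sketch of that blob interaction using the WindowsAzure.Storage client (the “releases” container matches the description above; the helper and file names are illustrative):

using System;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public static class ReleaseNotesStorage
{
    // Creates (or replaces) a markdown blob in the "releases" container
    // and appends release-note text to it.
    public static async Task WriteNotesAsync(string connectionString, string releaseName)
    {
        var account = CloudStorageAccount.Parse(connectionString);
        var container = account.CreateCloudBlobClient().GetContainerReference("releases");
        await container.CreateIfNotExistsAsync();

        var blob = container.GetAppendBlobReference($"{releaseName}.md");
        await blob.CreateOrReplaceAsync();
        await blob.AppendTextAsync($"# Release {releaseName}{Environment.NewLine}");
    }
}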

Creating a new release file with Azure Storage API

Azure Functions are a quick and straightforward way to enhance your workflows with Azure DevOps webhooks. The release notes generator sample code is a good start if you’re interested in exploring the serverless possibilities that work for you. The sample includes instructions on how to run it in Visual Studio. Once you’ve got your own generator up and running, be sure to visit the docs and samples to see what else you can do.

Resources

Sample Code on GitHub

Overview of Azure DevOps Project

Azure Functions Documentation

Develop Azure Functions using Visual Studio

Code and test Azure Functions Locally

WebHooks with Azure DevOps Services

Quickstart: Use .NET to create a blob in object storage

Get started with Storage Explorer

Microsoft Learn Learning Path: Create Serverless Applications

GitHub Version sister post

Announcing Entity Framework Core 2.2 Preview 3

Today we are making EF Core 2.2 Preview 3 available, together with a new preview of our data provider for Cosmos DB and updated spatial extensions for various providers.

Preview 3 is going to be the last milestone before EF Core 2.2 RTM, so now is your last chance to try the bits and give us feedback if you want to have an impact on the quality and the shape of the APIs in this release.

Besides the new features, you can help by trying EF Core 2.2 Preview 3 on applications that use third-party providers. Although we now have our own testing for this, there might be unforeseen compatibility problems, and the earlier we can detect them, the higher the chances we have of addressing them before RTM.

We thank you in advance for reporting any issues you find on our issue tracker on GitHub.

EF Core 2.2 roadmap update

EF Core 2.2 RTM is still planned for the end of the 2018 calendar year, alongside ASP.NET Core 2.2 and .NET Core 2.2.

However, based on a reassessment of the progress we have made so far, and on new information about the work we need to complete 2.2, we are no longer trying to include the following features in the EF Core 2.2 RTM:

  • Reverse engineering database views into query types: This feature is postponed to EF Core 3.0.
  • Cosmos DB Provider: Although we have made a lot of progress setting up the required infrastructure for document-oriented database support in EF Core, and have been steadily adding functionality to the provider, realistically we cannot arrive at a state in which we can release the provider with adequate functionality and quality in the current time frame for 2.2. Overall, we have found the work necessary to complete the provider to be more than we initially estimated. Also, ongoing evolution in Cosmos DB is leading us to frequently revisit decisions about such things as how we use the Cosmos DB SDK and whether we map all entities to a single collection by default. We plan to maintain the focus on the provider, to continue working with the Cosmos DB team, and to keep releasing previews of the provider regularly. You can expect at least one more preview by the end of this year, and an RTM sometime in 2019. We haven’t decided yet whether the Cosmos DB provider will release as part of EF Core 3.0 or earlier. A good way to keep track of our progress is this checklist in our issue tracker.

Obtaining the preview

The preview bits are available on NuGet, and also as part of ASP.NET Core 2.2 Preview 3 and the .NET Core SDK 2.2 Preview 3, also releasing today. If you want to try the preview in an application based on ASP.NET Core, we recommend you follow the instructions to upgrade to ASP.NET Core 2.2 Preview 3.

The SQL Server and in-memory providers are also included in ASP.NET Core, but for other providers and any other type of application, you will need to install the corresponding NuGet package.

For example, to add the 2.2 Preview 3 version of the SQL Server provider in a .NET Core library or application from the command line, use:

$ dotnet add package Microsoft.EntityFrameworkCore.SqlServer -v 2.2.0-preview3-35497

Or from the Package Manager Console in Visual Studio:

PM> Install-Package Microsoft.EntityFrameworkCore.SqlServer -Version 2.2.0-preview3-35497

For more details on how to add EF Core to your projects see our documentation on Installing Entity Framework Core.

The Cosmos DB provider and the spatial extensions ship as new separate NuGet packages. We’ll explain how to get started with them in the corresponding feature descriptions.

What is new in this preview?

Around 69 issues have been fixed since we finished Preview 2 last month. This includes product bug fixes and improvements to the new features. Specifically about the new features, the most significant changes are:

Spatial extensions

  • We have enabled spatial extensions to work with the SQLite provider using the popular SpatiaLite library.
  • We switched the default mapping of spatial properties on SQL Server from geometry to geography columns.
  • In order to use spatial extensions correctly with preview 3, it is recommended that you use the GeometryFactory provided by NetTopologySuite instead of creating new instances directly.
  • We collaborated with the NetTopologySuite team to create NetTopologySuite.IO.SqlServerBytes — a new IO module that targets .NET Standard and works directly with the SQL Server serialization format.
  • We enabled reverse engineering for databases containing spatial columns. Just make sure you add the spatial extension package for your database provider before you run Scaffold-DbContext or dotnet ef dbcontext scaffold.

Here is an updated usage example:


// Model class
public class Friend
{
  [Key]
  public string Name { get; set; }

  [Required]
  public IPoint Location { get; set; }
}

// Program
private static void Main(string[] args)
{
     // Create spatial factory
     var geometryFactory = NtsGeometryServices.Instance.CreateGeometryFactory(srid: 4326);

     // Set up data in database
     using (var context = new MyDbContext())
     {
         context.Database.EnsureDeleted();
         context.Database.EnsureCreated();

         context.Add(
             new Friend
             {
                 Name = "Bill",
                 Location = geometryFactory.CreatePoint(new Coordinate(-122.34877, 47.6233355))
             });
         context.Add(
             new Friend
             {
                 Name = "Paul",
                 Location = geometryFactory.CreatePoint(new Coordinate(-122.3308366, 47.5978429))
             });
         context.SaveChanges();
     }

     // find nearest friends
     using (var context = new MyDbContext())
     {
         var myLocation = geometryFactory.CreatePoint(new Coordinate(-122.13345, 47.6418066));

         var nearestFriends =
             (from f in context.Friends
               orderby f.Location.Distance(myLocation)
              select f).Take(5);

         Console.WriteLine("Your nearest friends are:");
         foreach (var friend in nearestFriends)
         {
             Console.WriteLine($"Name: {friend.Name}.");
         }
     }
}

In order to use this code with SQL Server, simply install the 2.2 preview 3 version of the Microsoft.EntityFrameworkCore.SqlServer.NetTopologySuite NuGet package, and configure your DbContext as follows:


public class MyDbContext : DbContext
{
    public DbSet<Friend> Friends { get; set; }

    protected override void OnConfiguring(DbContextOptionsBuilder options)
    {
        options.UseSqlServer(
            "Server=(localdb)\\mssqllocaldb;Database=SpatialFriends;ConnectRetryCount=0",
            b => b.UseNetTopologySuite());

    }
}

In order to use this code with SQLite, you can install the 2.2 preview 3 version of the Microsoft.EntityFrameworkCore.Sqlite.NetTopologySuite and Microsoft.EntityFrameworkCore.Sqlite packages. Then you can configure the DbContext like this:


public class MyDbContext : DbContext
{
    public DbSet<Friend> Friends { get; set; }

    protected override void OnConfiguring(DbContextOptionsBuilder options)
    {
        options.UseSqlite(
            "Filename=SpatialFriends.db",
            x => x.UseNetTopologySuite());
    }

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        // For SQLite, you need to configure the spatial reference system on the column
        modelBuilder
            .Entity<Friend>()
            .Property(f => f.Location)
            .ForSqliteHasSrid(4326);
    }
}

Note that the spatial extension for SQLite requires the SpatiaLite library. The NuGet packages mentioned above add it as a dependency on Windows, but on other systems you will need extra steps. For example:

  • On macOS:
    $ brew install libspatialite
  • On Ubuntu or Debian Linux:
    $ apt-get install libsqlite3-mod-spatialite

In order to use this code with the in-memory provider, install the NetTopologySuite package and the 2.2 preview 3 version of the Microsoft.EntityFrameworkCore.InMemory package. Then you can configure the DbContext like this:


public class MyDbContext : DbContext
{
    public DbSet<Friend> Friends { get; set; }

    protected override void OnConfiguring(DbContextOptionsBuilder options)
    {
        options.UseInMemoryDatabase("SpatialFriends");
    }
}

Cosmos DB provider

We have made several changes and improvements since preview 2:

  • The package has been renamed to Microsoft.EntityFrameworkCore.Cosmos
  • The UseCosmosSql() method has been renamed to UseCosmos()
  • We now store owned entity references and collections in the same document as the owner
  • Queries can now be executed asynchronously
  • SaveChanges(), EnsureCreated(), and EnsureDeleted() can now be executed synchronously
  • You no longer need to manually generate unique key values for entities
  • We preserve values in non-mapped properties when we update documents
  • We added a ToContainer() API to map entity types to a Cosmos DB container (or collection) explicitly (see the sketch after this list)
  • We now use the name of the derived DbContext type, rather than ‘Unicorn’, for the container or collection name we use by convention
  • We enabled various existing features to work with Cosmos DB, including retrying execution strategies, and data seeding
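
For example, the new ToContainer() API can be called from OnModelCreating. A minimal sketch, assuming the Blog entity from the sample below; the container name "Blogs" is illustrative:


protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    // Map the Blog entity type to an explicit Cosmos DB container,
    // instead of the conventional one named after the derived DbContext type.
    modelBuilder.Entity<Blog>().ToContainer("Blogs");
}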

We still have some pending work and several limitations to remove in the provider. Most of them are tracked as uncompleted tasks on our task list. In addition to those:

  • Currently, synchronous methods are much slower than the corresponding asynchronous methods
  • The value of the ‘id’ property has to be specified for seeding
  • There is currently no enforcement of uniqueness of primary key values on entities saved by multiple instances of the DbContext

In order to use the provider, install the 2.2 preview 3 version of the Microsoft.EntityFrameworkCore.Cosmos package.

The following example configures the DbContext to connect to the Cosmos DB local emulator to store a simple blogging model:


public class BloggingContext : DbContext
{
  public DbSet<Blog> Blogs { get; set; }
  public DbSet<Post> Posts { get; set; }

  protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
  {
    optionsBuilder.UseCosmos(
      "https://localhost:8081",
      "C2y6yDjf5/R+ob0N8A7Cgv30VRDJIWEHLM+4QDU5DE2nQ9nDuVTqobD4b8mGGyPMbIZnqyMsEcaGQy67XIw/Jw==",
      "MyDocuments");
  }
}

public class Blog
{
  public int BlogId { get; set; }
  public string Name { get; set; }
  public string Url { get; set; }
  public List<Post> Posts { get; set; }
}

public class Post
{
  public int PostId { get; set; }
  public string Title { get; set; }
  public string Content { get; set; }
  public List<Tag> Tags { get; set; }
}

[Owned]
public class Tag
{
    [Key]
    public string Name { get; set; }
}

If you want, you can create the database programmatically, using EF Core APIs:


using (var context = new BloggingContext())
{
  context.Database.EnsureCreated();
}

Once you have connected to an existing database and you have defined your entities, you can start storing data in the database, for example:


using (var context = new BloggingContext())
{
  context.Blogs.Add(
    new Blog
    {
        BlogId = 1,
        Name = ".NET Blog",
        Url = "https://blogs.msdn.microsoft.com/dotnet/",
        Posts = new List<Post>
        {
            new Post
            {
                PostId = 2,
                Title = "Welcome to this blog!",
                Tags = new List<Tag>
                {
                    new Tag
                    {
                        Name = "Entity Framework Core"
                    },
                    new Tag
                    {
                        Name = ".NET Core"
                    }
                }
            },
        }
    });
  context.SaveChanges();
}

And you can write queries using LINQ:


var dotNetBlog = context.Blogs.Single(b => b.Name == ".NET Blog");
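
Since queries can now be executed asynchronously (see the list of improvements above), the same query can also be written as follows. This is a minimal sketch; it assumes using directives for Microsoft.EntityFrameworkCore and System.Threading.Tasks:


// Async variant of the query above; SingleAsync is the standard
// EF Core async LINQ operator.
private static async Task<Blog> FindDotNetBlogAsync(BloggingContext context)
{
    return await context.Blogs.SingleAsync(b => b.Name == ".NET Blog");
}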

Query tags

  • We fixed several issues with multiple calls of the API and with usage of multi-line strings
  • The API was renamed to TagWith()

This is an updated usage example:


  var nearestFriends =
      (from f in context.Friends.TagWith(@"This is my spatial query!")
      orderby f.Location.Distance(myLocation) descending
      select f).Take(5).ToList();

This will generate the following SQL output:


-- This is my spatial query!

SELECT TOP(@__p_1) [f].[Name], [f].[Location]
FROM [Friends] AS [f]
ORDER BY [f].[Location].STDistance(@__myLocation_0) DESC

Collections of owned entities

  • The main update since preview 2 is that the Cosmos DB provider now stores owned collections as part of the same document as the owner.

Here is a simple usage scenario:


modelBuilder.Entity<Customer>().OwnsMany(c => c.Addresses);
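
Fleshed out a bit, here is a minimal sketch. The Customer and Address types are hypothetical, not from the original post; depending on the provider, you may also need to configure a key for the owned type:


public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
    public IList<Address> Addresses { get; set; }   // hypothetical owned collection
}

public class Address
{
    public string Street { get; set; }
    public string City { get; set; }
}

// In the owner's DbContext:
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    // Addresses are owned by Customer; with the Cosmos DB provider they are
    // now stored inside the same document as the owning Customer.
    modelBuilder.Entity<Customer>().OwnsMany(c => c.Addresses);
}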

Thank you

The EF team would like to thank everyone for all the feedback and contributions. Once more, please try this preview and report any feedback on our issue tracker.

Announcing .NET Core 2.2 Preview 3


Today, we are announcing .NET Core 2.2 Preview 3. We have made more improvements to the overall release that we would love to get your feedback on, either in the comments or at dotnet/core #2004.

ASP.NET Core 2.2 Preview 3 and Entity Framework 2.2 Preview 3 were also released today.

You can see more details of the release in the .NET Core 2.2 Preview 3 release notes. Related instructions, known issues, and workarounds are included in the release notes. Please report any issues you find in the comments or at dotnet/core #2004.

Please see the .NET Core 2.2 Preview 2 post to learn more about the new features coming with .NET Core 2.2.

Thanks to everyone who contributed to .NET Core 2.2. You’ve helped make .NET Core a better product!

Download .NET Core 2.2

You can download and get started with .NET Core 2.2 Preview 3 on Windows, macOS, and Linux.

Docker images are available at microsoft/dotnet for .NET Core and ASP.NET Core.

.NET Core 2.2 Preview 3 can be used with Visual Studio 15.9 Preview 3 (or later), Visual Studio for Mac and Visual Studio Code.

Platform Support

.NET Core 2.2 is supported on the following operating systems:

  • Windows Client: 7, 8.1, 10 (1607+)
  • Windows Server: 2008 R2 SP1+
  • macOS: 10.12+
  • RHEL: 6+
  • Fedora: 27+
  • Ubuntu: 14.04+
  • Debian: 8+
  • SLES: 12+
  • openSUSE: 42.3+
  • Alpine: 3.7+

Chip support follows:

  • x64 on Windows, macOS, and Linux
  • x86 on Windows
  • ARM32 on Linux (Ubuntu 18.04+, Debian 9+)

Closing

Please download and test .NET Core 2.2 Preview 3. We’re looking for feedback on the release with the intent of shipping the final version later this year.

.NET Framework October 2018 Preview of Quality Rollup


Today, we are releasing the October 2018 Preview of Quality Rollup.

Quality and Reliability

This release contains the following quality and reliability improvements.

CLR

  • Updated Japanese dates that are formatted for the first year in an era and for which the format pattern uses “y年”. The format of the year together with the symbol “元” is supported instead of using year number 1. Also, formatting day numbers that include “元” is supported. [646179]
  • Updated Venezuela currency information. This change affects the “es-VE” culture in the following ways (see the sketch after this list). [616146]
    1) The currency symbol changed to “Bs.S”
    2) The English currency name changed to “Bolívar Soberano”
    3) The native currency name changed to “bolívar soberano”
    4) The international currency code changed to “VES”
  • Addressed a situation where the System.Security.Cryptography.Algorithms reference was not correctly loaded on .NET Framework 4.7.1 after the 7B/8B patch. [673870]
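
As a rough illustration of where these culture updates surface, the following sketch reads the es-VE currency data and formats a Japanese-calendar date. The expected values in the comments assume this update is installed, and the era-year rendering can depend on opt-in configuration:


using System;
using System.Globalization;

static void ShowCultureUpdates()
{
    // Venezuela currency data, updated to the bolívar soberano.
    var region = new RegionInfo("es-VE");
    Console.WriteLine(region.CurrencySymbol);     // expected: Bs.S
    Console.WriteLine(region.ISOCurrencySymbol);  // expected: VES

    // Japanese calendar formatting; with the update, the first year of an
    // era can be rendered with 元 when the pattern uses "y年".
    var jaJp = new CultureInfo("ja-JP");
    jaJp.DateTimeFormat.Calendar = new JapaneseCalendar();
    Console.WriteLine(new DateTime(1989, 2, 1).ToString("ggy年M月d日", jaJp));
}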

WF

  • In some .NET Remoting scenarios, when using TransactionScopeAsyncFlowOption.Enabled, it was possible to have Transaction.Current reset to null after a remoting call. [669153]
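
The affected pattern looks roughly like the following sketch; IRemoteService and DoWork are hypothetical stand-ins for a real .NET Remoting proxy and call:


using System.Transactions;

void UpdateOverRemoting(IRemoteService remoteService)   // IRemoteService is hypothetical
{
    using (var scope = new TransactionScope(TransactionScopeAsyncFlowOption.Enabled))
    {
        remoteService.DoWork();           // remoting call
        var tx = Transaction.Current;     // could previously come back null here
        scope.Complete();
    }
}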

WPF

  • Addressed an issue where an application created numerous Windows Forms textboxes in a FlowLayoutPanel with only a few calls to comctl32.dll. [638365]
  • Addressed a race condition involving temporary files and some anti-virus scanners that was causing crashes with the message “The process cannot access the file …”. [638468]
  • Addressed a crash due to TaskCanceledException that can occur during shutdown of some WPF apps. Apps that continue to do work involving weak events or data binding after Application.Run() returns are known to be vulnerable to this crash. [655427]

Note: Additional information on these improvements is not available. The VSTS bug number provided with each improvement is a unique ID that you can give Microsoft Customer Support, include in StackOverflow comments or use in web searches.

Getting the Update

The Security and Quality Rollup is available via Windows Update, Windows Server Update Services, and Microsoft Update Catalog.

Microsoft Update Catalog

You can get the update via the Microsoft Update Catalog. For Windows 10, .NET Framework updates are part of the Windows 10 Monthly Rollup.

The following table is for Windows 10 and Windows Server 2016+ versions.

Product Version Preview of Quality Rollup KB
Windows 10 1709 (Fall Creators Update) Catalog
4462932
.NET Framework 3.5, 4.7.1, 4.7.2 4462932
Windows 10 1703 (Creators Update) Catalog
4462939
.NET Framework 3.5, 4.7, 4.7.1, 4.7.2 4462939
Windows 10 1607 (Anniversary Update) Catalog
4462930
.NET Framework 3.5, 4.6.2, 4.7, 4.7.1, 4.7.2 4462930

The following table is for earlier Windows and Windows Server versions.

Product Version Preview of Quality Rollup KB
Windows 8.1
Windows RT 8.1
Windows Server 2012 R2
Catalog
4462502
.NET Framework 3.5 4459935
.NET Framework 4.5.2 4459943
.NET Framework 4.6, 4.6.1, 4.6.2, 4.7, 4.7.1, 4.7.2 4459941
Windows Server 2012 Catalog
4462501
.NET Framework 3.5 4459932
.NET Framework 4.5.2 4459944
.NET Framework 4.6, 4.6.1, 4.6.2, 4.7, 4.7.1, 4.7.2 4459940
Windows 7
Windows Server 2008 R2
Catalog
4462500
.NET Framework 3.5.1 4459934
.NET Framework 4.5.2 4459945
.NET Framework 4.6, 4.6.1, 4.6.2, 4.7, 4.7.1, 4.7.2 4459942
Windows Server 2008 Catalog
4462503
.NET Framework 2.0, 3.0 4459933
.NET Framework 4.5.2 4459945
.NET Framework 4.6 4459942

Previous Monthly Rollups

The last few .NET Framework Monthly updates are listed below for your convenience:

Call For Participation: .NET and TypeScript at FOSDEM conference


The organizers of the FOSDEM 2019 conference have allocated a “Developer Room” on Saturday, February 2nd, 2019 for .NET and TypeScript discussions in Brussels, Belgium. FOSDEM is one of Europe’s most exciting free software conferences; it runs over a weekend and gathers many open source communities, contributors, and activists in one place to learn about the state of the world.

We are looking for:

  • Keynote presenters, with bold and vibrant ideas to share with the community at large
  • Technical presentations (30 minutes, including questions and discussions) related to .NET, C#, F#, TypeScript
  • Presentations about the use of .NET or TypeScript for commercial, academic, hobbyist, and other projects
  • Tutorials
  • Lightning talks (5 minutes each)

This is a partial list of ideas that might be of interest to the audience; it is by no means comprehensive, so if your project or idea is not included, feel free to submit a proposal anyway:

  • Innovative ideas and fresh takes on old problems, for server and client applications
  • Interoperability with other ecosystems
  • Best practices, code quality, and testing
  • Building software for users
  • Compilers, Runtimes, Libraries
  • IDEs
  • Tips and Tricks

Submission

To submit a talk proposal, please include:

  • Title
  • Abstract (at least two paragraphs)
    • Describe what your project or presentation is about.
    • What the audience will learn from the talk.
    • How can people help out with the project, what feedback are you looking for?
  • Recording me on audio and/or video
    • acceptable under a CC-BY-2.0 license (DEFAULT)
    • not acceptable
  • Brief Bio
  • Microblog URL
  • Blog URL

To submit your proposal, please create an account on the FOSDEM site at https://penta.fosdem.org/user/new_account, submit your proposal at https://penta.fosdem.org/submission/FOSDEM19, and make sure you flag it as being part of the .NET and TypeScript developer room.

Deadline

The deadline for receiving submissions is December 3rd, 2018. Speakers will be notified of acceptance or rejection by the 18th of December.

 
