SQL Server: Auto-Generate a UNIQUEIDENTIFIER

Goal: insert data into a table and:

✅ Have SQL Server auto-generate both:

  • Id (as INT IDENTITY)

  • RowGuid (as UNIQUEIDENTIFIER, using NEWID(), not Guid.NewGuid() from C#)

✅ Insert via stored procedure
✅ Return the newly inserted row (including Id and RowGuid) to your .NET Core API


✅ Full SQL Server Setup

🔹 1. Create Table (with auto-generated RowGuid)

CREATE TABLE Items (
    Id INT IDENTITY(1,1) PRIMARY KEY,
    RowGuid UNIQUEIDENTIFIER NOT NULL DEFAULT NEWID(),
    Name NVARCHAR(100),
    CreatedAt DATETIME NOT NULL DEFAULT GETUTCDATE()
);

🧠 Explanation:

Column    | Description
Id        | Auto-increment primary key
RowGuid   | Auto-generated using SQL Server's NEWID()
CreatedAt | Set automatically to current UTC time

🔹 2. Create Stored Procedure (only passes @Name)

CREATE PROCEDURE InsertItem
    @Name NVARCHAR(100)
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO Items (Name)
    VALUES (@Name);

    -- Return the inserted row using SCOPE_IDENTITY()
    SELECT * FROM Items WHERE Id = SCOPE_IDENTITY();
END

🧠 Why SCOPE_IDENTITY()?

It returns the identity value generated by the most recent insert in the current scope, so it is not affected by inserts performed by triggers or by other sessions (unlike @@IDENTITY).
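An alternative worth knowing (a sketch, not part of the setup above): the OUTPUT clause returns the inserted row directly from the INSERT statement, and it also works correctly for multi-row inserts. The procedure name InsertItemWithOutput is hypothetical.

```sql
-- Hypothetical variant of InsertItem using the OUTPUT clause.
-- OUTPUT returns the inserted row(s) without a second SELECT.
CREATE PROCEDURE InsertItemWithOutput
    @Name NVARCHAR(100)
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO Items (Name)
    OUTPUT INSERTED.Id, INSERTED.RowGuid, INSERTED.Name, INSERTED.CreatedAt
    VALUES (@Name);
END
```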


✅ .NET Core API Integration

🔹 1. DTO (for request)

public class ItemDto
{
    public string Name { get; set; }
}

🔹 2. Entity (for mapping result)

public class Item
{
    public int Id { get; set; }
    public Guid RowGuid { get; set; }
    public string Name { get; set; }
    public DateTime CreatedAt { get; set; }
}

🔹 3. API Controller Method

[HttpPost("create")]
public IActionResult CreateItem([FromBody] ItemDto dto)
{
    var result = _context.Items
        .FromSqlRaw("EXEC InsertItem @p0", dto.Name)
        .AsEnumerable()
        .FirstOrDefault();

    return Ok(result);
}

This uses FromSqlRaw() to call the stored procedure and fetch the inserted row.


✅ Example Request

POST /api/items/create
Content-Type: application/json

{
  "name": "My First Item"
}

✅ Example Response

{
  "id": 42,
  "rowGuid": "ebf679a1-9dc5-4f0e-b0a2-45c97a2a7c77",
  "name": "My First Item",
  "createdAt": "2025-09-26T16:00:00Z"
}

✅ Benefits of This Approach

Feature                 | Why it's good
No Guid.NewGuid() in C# | SQL Server handles it (DEFAULT NEWID())
Clean API               | Client just sends the name
Secure and consistent   | RowGuid is unguessable, great for future updates
Lightweight SP          | Only handles what it needs: name and insert logic

Possible extensions:

  • Use NEWSEQUENTIALID() for better index performance

  • Add update/delete stored procedures

  • Generate SQL script from EF Core (.HasDefaultValueSql("NEWID()"))
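For reference, a minimal sketch of the NEWSEQUENTIALID() option (the table and constraint names below are illustrative): NEWSEQUENTIALID() is only allowed as a column DEFAULT, and it generates roughly sequential GUIDs, which reduces index fragmentation compared to the random values produced by NEWID().

```sql
-- Sketch: sequential GUIDs via a DEFAULT constraint (names are illustrative)
CREATE TABLE ItemsSequential (
    Id INT IDENTITY(1,1) PRIMARY KEY,
    RowGuid UNIQUEIDENTIFIER NOT NULL
        CONSTRAINT DF_ItemsSequential_RowGuid DEFAULT NEWSEQUENTIALID(),
    Name NVARCHAR(100)
);
```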

Next: ten advanced SQL Server topics, each with a real-world use case, a code example, and a detailed explanation. These examples build on what you already know (SELECT, JOIN, INSERT, UPDATE) but show how to go further with SQL Server's features.

 

🔸 1. Common Table Expressions (CTEs)

✅ Use Case: Get the second most recent order per customer.

WITH OrderRanks AS (
    SELECT
        CustomerID,
        OrderID,
        OrderDate,
        RANK() OVER (PARTITION BY CustomerID ORDER BY OrderDate DESC) AS OrderRank
    FROM Orders
)
SELECT *
FROM OrderRanks
WHERE OrderRank = 2;

🔍 Explanation:

  • WITH OrderRanks AS (...) defines a CTE.

  • RANK() assigns ranks per customer based on OrderDate.

  • We filter only for OrderRank = 2 → gives the second latest order per customer.


🔸 2. CASE Statements (Conditional Logic)

✅ Use Case: Categorize customers by country.

SELECT
    CustomerID,
    Name,
    Country,
    CASE
        WHEN Country IN ('USA', 'Canada') THEN 'North America'
        WHEN Country IN ('France', 'Germany') THEN 'Europe'
        ELSE 'Other'
    END AS Region
FROM Customers;

🔍 Explanation:

  • CASE adds if-else logic inside a query.

  • This example labels customers by region.


🔸 3. Window Functions – LAG/LEAD

✅ Use Case: Compare each customer’s latest order to their previous one.

SELECT
    CustomerID,
    OrderDate,
    TotalAmount,
    LAG(TotalAmount) OVER (PARTITION BY CustomerID ORDER BY OrderDate) AS PreviousAmount
FROM Orders;

🔍 Explanation:

  • LAG() looks backward at the previous row.

  • Shows how a customer’s spending changed from their last order.


🔸 4. Stored Procedure with Parameters

✅ Use Case: Create a procedure that shows all orders above a certain amount.

CREATE PROCEDURE GetLargeOrders
    @MinAmount DECIMAL(10, 2)
AS
BEGIN
    SELECT *
    FROM Orders
    WHERE TotalAmount > @MinAmount;
END;

To run:

EXEC GetLargeOrders @MinAmount = 500;

🔍 Explanation:

  • Stored procedures let you encapsulate queries for reuse.

  • You can pass input via parameters like @MinAmount.


🔸 5. Triggers

✅ Use Case: Automatically log when a customer is deleted.

CREATE TABLE CustomerLogs (
    CustomerID INT,
    DeletedAt DATETIME
);

CREATE TRIGGER trg_CustomerDelete
ON Customers
AFTER DELETE
AS
BEGIN
    INSERT INTO CustomerLogs (CustomerID, DeletedAt)
    SELECT CustomerID, GETDATE()
    FROM DELETED;
END;

🔍 Explanation:

  • This AFTER DELETE trigger fires automatically.

  • It inserts a record into CustomerLogs every time a customer is deleted.


🔸 6. Transactions (Atomic Operations)

✅ Use Case: Insert order and details together, or not at all.

BEGIN TRY
    BEGIN TRANSACTION;

    INSERT INTO Orders (CustomerID, OrderDate, TotalAmount)
    VALUES (1, GETDATE(), 300);

    DECLARE @OrderID INT = SCOPE_IDENTITY();

    INSERT INTO OrderDetails (OrderID, ProductID, Quantity, Price)
    VALUES (@OrderID, 1001, 2, 150);

    COMMIT;
END TRY
BEGIN CATCH
    ROLLBACK;
    PRINT 'Something went wrong: ' + ERROR_MESSAGE();
END CATCH;

🔍 Explanation:

  • Ensures that both Orders and OrderDetails insert succeed together.

  • SCOPE_IDENTITY() grabs the last inserted ID.

  • If anything fails, ROLLBACK cancels it all.


🔸 7. Views

✅ Use Case: Create a simplified, reusable customer order summary.

CREATE VIEW vw_CustomerSummary AS
SELECT
    c.Name,
    o.OrderID,
    o.OrderDate,
    o.TotalAmount
FROM Customers c
JOIN Orders o ON c.CustomerID = o.CustomerID;

-- Use the view
SELECT * FROM vw_CustomerSummary WHERE TotalAmount > 1000;

🔍 Explanation:

  • Views let you hide complexity.

  • Acts like a virtual table based on SELECT logic.


🔸 8. Indexing

✅ Use Case: Speed up queries filtering on OrderDate.

CREATE NONCLUSTERED INDEX IX_Orders_OrderDate ON Orders(OrderDate);

🔍 Explanation:

  • Indexes improve performance on WHERE, JOIN, and ORDER BY clauses.

  • But too many indexes can slow down inserts/updates.
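To illustrate the trade-off, here is a sketch of a covering index (the index name is illustrative): the INCLUDE columns let a query that filters on OrderDate and returns CustomerID and TotalAmount be answered from the index alone, avoiding a key lookup, at the cost of extra storage and slightly slower writes.

```sql
-- Sketch: a covering index for queries that filter on OrderDate
-- and select CustomerID and TotalAmount (no key lookup needed)
CREATE NONCLUSTERED INDEX IX_Orders_OrderDate_Covering
    ON Orders (OrderDate)
    INCLUDE (CustomerID, TotalAmount);
```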


🔸 9. Temporary Tables

✅ Use Case: Store intermediate results for complex queries.

SELECT CustomerID, SUM(TotalAmount) AS TotalSpent
INTO #CustomerTotals
FROM Orders
GROUP BY CustomerID;

SELECT * FROM #CustomerTotals WHERE TotalSpent > 1000;

🔍 Explanation:

  • #CustomerTotals is a temporary table that exists during the session.

  • Useful for breaking down complex steps.


🔸 10. Dynamic SQL

✅ Use Case: Build a query based on input values.

DECLARE @TableName NVARCHAR(100) = 'Customers';
DECLARE @SQL NVARCHAR(MAX);

SET @SQL = 'SELECT * FROM ' + QUOTENAME(@TableName);

EXEC sp_executesql @SQL;

🔍 Explanation:

  • Lets you build and execute queries at runtime.

  • Useful when table names or columns are dynamic.
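One caution worth adding: concatenating user input into dynamic SQL risks SQL injection. Object names must be concatenated (guarded with QUOTENAME, as above), but values can be bound as parameters to sp_executesql, as in this sketch using the Orders table from the earlier examples:

```sql
-- Sketch: parameterized dynamic SQL; the value is bound, not concatenated
DECLARE @SQL NVARCHAR(MAX) =
    N'SELECT * FROM Orders WHERE TotalAmount > @MinAmount';

EXEC sp_executesql
    @SQL,
    N'@MinAmount DECIMAL(10, 2)',
    @MinAmount = 500;
```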


✅ Summary Table

Topic            | Key Feature              | Use Case
CTE              | Temporary result set     | Second latest order per customer
CASE             | Conditional logic        | Label customers by region
Window Functions | LAG / LEAD               | Compare current vs previous order
Stored Procedure | Parameterized logic      | Reusable large order query
Trigger          | Automatic action         | Log customer deletions
Transaction      | Atomic multi-step ops    | Insert order & details together
View             | Virtual table            | Simplify query logic
Index            | Performance optimization | Fast filtering on OrderDate
Temp Table       | Session-level storage    | Intermediate data processing
Dynamic SQL      | Runtime SQL generation   | Query variable table names

Exceeding the SelectPdf Free Edition 5-Page Limit by Merging with PdfSharp (.NET Examples)

SelectPdf Free Edition supports up to 5 pages per document. That's why we generate the document in two parts and merge them with PdfSharp.


Install these packages:

dotnet add package Select.Pdf.NetCore

dotnet add package PdfSharp


Full Working Code


using System;
using System.IO;
using System.Net;
using System.Text;
using SelectPdf;
using PdfSharp.Pdf.IO;
// Note: both SelectPdf and PdfSharp define a PdfDocument type, so the
// PdfSharp type is referenced by its full name below to avoid ambiguity.

class HtmlAgreementPdfGenerator
{
    static void Main()
    {
        string base64Merged = GenerateMergedPdfFromAgreement();
        File.WriteAllBytes("FinalAgreement.pdf", Convert.FromBase64String(base64Merged));
        Console.WriteLine("✅ Merged PDF saved as 'FinalAgreement.pdf'");
    }

    public static string GenerateMergedPdfFromAgreement()
    {
        ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;

        // Generate the first 5 pages (the SelectPdf free edition limit)
        StringBuilder htmlFirst5Pages = new StringBuilder();
        for (int i = 1; i <= 5; i++)
        {
            htmlFirst5Pages.Append(BuildAgreementHtml(i));
        }

        // Generate the remaining 2 pages
        StringBuilder htmlLast2Pages = new StringBuilder();
        for (int i = 6; i <= 7; i++)
        {
            htmlLast2Pages.Append(BuildAgreementHtml(i));
        }

        // Convert both HTML segments to Base64-encoded PDFs
        string base64Pdf1to5 = ConvertHtmlToBase64Pdf(htmlFirst5Pages.ToString());
        string base64Pdf6to7 = ConvertHtmlToBase64Pdf(htmlLast2Pages.ToString());

        // Merge both into one Base64 PDF
        return MergeBase64Pdfs(base64Pdf1to5, base64Pdf6to7);
    }

    private static string BuildAgreementHtml(int pageNumber)
    {
        StringBuilder sb = new StringBuilder();
        sb.Append("<html><head><style>");
        sb.Append("body { font-family: Arial; line-height: 1.6; margin: 40px; }");
        sb.Append("h1, h2 { color: #333; }");
        sb.Append("p { text-align: justify; }");
        sb.Append("</style></head><body>");

        sb.Append($"<h1>Service Agreement - Page {pageNumber}</h1>");
        sb.Append("<h2>1. Introduction</h2>");
        sb.Append("<p>Lorem ipsum dolor sit amet, consectetur adipiscing elit. Integer nec odio. Praesent libero. Sed cursus ante dapibus diam.</p>");

        sb.Append("<h2>2. Terms and Conditions</h2>");
        for (int i = 0; i < 5; i++)
        {
            sb.Append("<p>Lorem ipsum dolor sit amet, consectetur adipiscing elit. Sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.</p>");
        }

        sb.Append("<h2>3. Confidentiality</h2>");
        for (int i = 0; i < 3; i++)
        {
            sb.Append("<p>Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat.</p>");
        }

        sb.Append("<h2>4. Termination</h2>");
        sb.Append("<p>Duis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla pariatur.</p>");

        sb.Append("<h2>5. Signatures</h2>");
        sb.Append("<p>Signed by both parties as a binding agreement.</p>");

        sb.Append("</body></html>");
        return sb.ToString();
    }

    private static string ConvertHtmlToBase64Pdf(string html)
    {
        HtmlToPdf converter = new HtmlToPdf();
        converter.Options.PdfPageSize = PdfPageSize.A4;
        converter.Options.WebPageWidth = 1024;
        converter.Options.MarginTop = 20;
        converter.Options.MarginBottom = 20;

        SelectPdf.PdfDocument doc = converter.ConvertHtmlString(html);
        using (MemoryStream ms = new MemoryStream())
        {
            doc.Save(ms);
            doc.Close();
            return Convert.ToBase64String(ms.ToArray());
        }
    }

    private static string MergeBase64Pdfs(string base64Pdf1, string base64Pdf2)
    {
        byte[] pdfBytes1 = Convert.FromBase64String(base64Pdf1);
        byte[] pdfBytes2 = Convert.FromBase64String(base64Pdf2);

        PdfSharp.Pdf.PdfDocument input1 = PdfReader.Open(new MemoryStream(pdfBytes1), PdfDocumentOpenMode.Import);
        PdfSharp.Pdf.PdfDocument input2 = PdfReader.Open(new MemoryStream(pdfBytes2), PdfDocumentOpenMode.Import);

        PdfSharp.Pdf.PdfDocument merged = new PdfSharp.Pdf.PdfDocument();
        for (int i = 0; i < input1.PageCount; i++)
            merged.AddPage(input1.Pages[i]);
        for (int i = 0; i < input2.PageCount; i++)
            merged.AddPage(input2.Pages[i]);

        using (MemoryStream outputMs = new MemoryStream())
        {
            merged.Save(outputMs);
            return Convert.ToBase64String(outputMs.ToArray());
        }
    }
}

Importing Tables from Production into UAT (SQL Server)

If you're using SQL Server and want to import 20+ tables from Production into UAT in one go, here's a streamlined process using SQL Server Management Studio (SSMS).

 

Goal: Import All Tables from Production into UAT (One Time)

🔧 Method 1: Generate Scripts (Schema + Data)

This is the cleanest way to export everything from Production and re-create it in UAT.

🔁 Steps:

  1. Open SSMS and connect to your Production DB.

  2. Right-click on the Production database > go to:

    Tasks > Generate Scripts
  3. Select "Specific database objects" → Choose all the tables (you can multi-select).

  4. Click Next until you reach Set Scripting Options:

    • Click Advanced

    • Set “Types of data to script” to:

      Schema and Data
  5. Choose output:

    • Save to file or Clipboard (recommended: file for large exports).

  6. Finish the wizard — this generates a .sql script file with:

    • All table definitions

    • All data insert statements

  7. Switch to your UAT database in SSMS.

  8. Run the generated script in UAT.


🧪 Method 2: Import Data via SSMS Import Wizard (Only Data)

If your UAT already has the same tables and you only want to copy the data, use this.

🔁 Steps:

  1. Open SSMS.

  2. Connect to the UAT database.

  3. Right-click UAT DB > go to:

    Tasks > Import Data
  4. In the Import Wizard:

    • Source: Select SQL Server as source, point it to Production DB.

    • Destination: Point to UAT DB.

  5. Choose Copy data from one or more tables.

  6. Select the tables (you can check all 20+).

  7. Review column mappings (optional).

  8. Run the import.

✅ This copies data only, not schema.
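One gotcha when copying data this way: if the destination table has an IDENTITY column and you want to preserve the source identity values, enable IDENTITY_INSERT first. This is a sketch only; the table and database names are illustrative.

```sql
-- Sketch: preserving identity values while copying data (names illustrative)
SET IDENTITY_INSERT dbo.Items ON;

INSERT INTO dbo.Items (Id, Name)
SELECT Id, Name
FROM SourceDB.dbo.Items;

SET IDENTITY_INSERT dbo.Items OFF;
```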


🧠 Tip: Use Linked Server (Advanced Option)

If this is something you'll do repeatedly, consider setting up a linked server between UAT and Production and then use:

INSERT INTO UATDB.dbo.TableName
SELECT * FROM [ProdServer].[ProdDB].dbo.TableName;

📌 Summary

Method                     | Schema | Data | Tools Required
Generate Scripts           | Yes    | Yes  | SSMS
Import Wizard              | No     | Yes  | SSMS
Linked Server + SQL Insert | No     | Yes  | SSMS + setup

The reverse direction, copying selected databases (or specific tables) from UAT → Production, works similarly but requires extra care.


✅ Goal: Deploy Selected Tables (or Data) from UAT to Production in SQL Server

You can do this safely and selectively, but it depends whether you want to copy:

  1. Table schema only (table definitions)

  2. Data only

  3. Both schema + data

  4. Stored procedures, views, etc.

I'll guide you through the best method using SQL Server Management Studio (SSMS).


🧰 Option 1: Use Generate Scripts Wizard (Recommended for small sets)

Works for: Specific tables, stored procs, data, etc.

🔁 Steps:

  1. In SSMS, connect to UAT database.

  2. Right-click the UAT DB > go to:

    Tasks > Generate Scripts
  3. Choose “Select specific database objects”

    • Pick only the tables, procs, etc. you want to copy.

  4. Click Next, then go to Advanced.

    • Set:

      • Types of data to script = Schema and Data

      • Script DROP and CREATE = CREATE only (unless you want to overwrite)

  5. Save the script to a .sql file.

  6. Open SSMS, connect to Production, and run the script carefully.

✅ This way, only selected objects are copied to Production.


⚙️ Option 2: Use Export Data Wizard (Data Only)

Use when UAT & Production already have the same schema, and you only want to move data.

  1. In SSMS, right-click the UAT database > go to:

    Tasks > Export Data
  2. Choose:

    • Source = UAT

    • Destination = Production

  3. Choose tables to copy

  4. Run the wizard.

✅ This will copy only data into production tables (be cautious — it can overwrite!).


🛡️ Important Notes

  • NEVER run directly in Production without:

    • Backup

    • Testing the script in a staging environment

  • Use transactions when running custom scripts:

BEGIN TRANSACTION;

-- Your insert/update/delete statements

-- COMMIT only if verified
COMMIT;

-- Or roll back if there is an issue
-- ROLLBACK;

🧪 Option 3: Custom Script to Copy UAT Data to Production

If you only want a few tables:

-- From UAT DB to Production DB
-- Make sure both DBs are on the same SQL Server
INSERT INTO [ProductionDB].[dbo].[YourTable]
SELECT * FROM [UATDB].[dbo].[YourTable];

Or, if on different servers, set up a linked server, then:

INSERT INTO ProdServer.ProductionDB.dbo.YourTable
SELECT * FROM UATServer.UATDB.dbo.YourTable;

🔐 Final Checklist Before Updating Production

✅ Review all scripts
✅ Take production backup
✅ Test in staging/UAT
✅ Use transactions
✅ Inform stakeholders

Useful VS Code Extensions / Packages

 

  • GitLens: enhances built-in Git support with inline blame, a commit graph, file history, and authorship. Lets you see a visual graph of commit history and who made which changes, when; very handy for identifying key commits or comparing contributions.

  • Git Graph: shows a graphical history of your repository (commits, branches, merges). Useful for seeing branching and merging structure and where work diverged.

  • Git Tree Compare: diffs your working tree against a branch, tag, or commit in a tree view. Helps you see all file-level changes at once when comparing versions; good for pull-request and merge previews.

  • RepoTree: generates a visual tree representation of the directories and files in your repo. Useful for quickly seeing project structure during navigation and code review.

  • Project Tree: maintains a tree layout of the project in the explorer pane and remembers open files/tabs. Gives team members a consistent view of the structure and helps navigate large projects.

🔧 For full Visual Studio (Windows, .NET)

In full Visual Studio (not VS Code), there are options such as:

  • Git Tools / Git Extensions for Visual Studio — built‑in or plugin‑add ons that show history, diffs, authorship.

  • SourceTree or TortoiseGit — external GUIs that can show visual trees of commits, branches; you can use them alongside Visual Studio.

  • CodeLens (in Visual Studio) — shows in the code editor who changed lines, when, etc.

HiQPdf – Convert URL to PDF and Stream to Browser (ASP.NET)

using System;
using System.Web;
using HiQPdf;

public class PdfController : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Example URL to convert; could come from a query string or form input
        string targetUrl = "https://example.com";

        // Initialize the converter
        HtmlToPdf converter = new HtmlToPdf();

        // Optional: page size and orientation
        converter.Document.PageSize = PdfPageSize.A4;
        converter.Document.PageOrientation = PdfPageOrientation.Portrait;

        // Optional: web page rendering options
        converter.Document.WebPageWidth = 1024;
        converter.Document.WebPageHeight = 0; // Auto height

        try
        {
            // Convert the URL to a PDF byte array
            byte[] pdfBytes = converter.ConvertUrlToMemory(targetUrl);

            // Set HTTP response headers for inline display
            Response.Clear();
            Response.ContentType = "application/pdf";
            Response.AddHeader("Content-Disposition", "inline; filename=GeneratedDocument.pdf");
            Response.AddHeader("Content-Length", pdfBytes.Length.ToString());

            // Write the PDF to the response stream
            Response.BinaryWrite(pdfBytes);
            Response.Flush();
            Response.End(); // End the response to prevent further rendering
        }
        catch (Exception ex)
        {
            Response.Write("Error generating PDF: " + ex.Message);
        }
    }
}



Button-click handler variant (downloads the PDF as an attachment):

protected void btnGeneratePdf_Click(object sender, EventArgs e)
{
    string targetUrl = txtUrl.Text.Trim(); // Example textbox input

    HtmlToPdf converter = new HtmlToPdf();
    converter.Document.PageSize = PdfPageSize.A4;
    converter.Document.WebPageWidth = 1024;

    byte[] pdfBytes = converter.ConvertUrlToMemory(targetUrl);

    Response.Clear();
    Response.ContentType = "application/pdf";
    Response.AddHeader("Content-Disposition", "attachment; filename=site.pdf");
    Response.BinaryWrite(pdfBytes);
    Response.End();
}



Parameter     | Meaning
object sender | The control that raised the event (e.g., the clicked button)
EventArgs e   | Extra data about the event (empty in basic events)


Downloading Files (PDF, Excel) with fetch + Blob in ASP.NET Core

Let's explore real-world examples of the fetch + Blob pattern, commonly used in modern ASP.NET Core apps, targeting different file types and scenarios.


✅ 1. Download Static PDF via Fetch + Blob

🔹 JavaScript

<button onclick="downloadStaticPdf()">Download PDF</button>

<script>
function downloadStaticPdf() {
    fetch('/files/sample.pdf') // Static file from wwwroot/files/
        .then(res => res.blob())
        .then(blob => {
            const url = URL.createObjectURL(blob);
            const a = document.createElement('a');
            a.href = url;
            a.download = 'sample.pdf';
            document.body.appendChild(a);
            a.click();
            a.remove();
            URL.revokeObjectURL(url);
        })
        .catch(err => console.error('Download failed', err));
}
</script>

✅ Good for static files in wwwroot.


✅ 2. Download File from Controller Action (e.g. Generated on the Fly)

🔹 C# Controller (.cs file)

[HttpGet]
public IActionResult DownloadDynamicPdf()
{
    // Simulate dynamic PDF generation
    var filePath = Path.Combine(Directory.GetCurrentDirectory(), "wwwroot/files/sample.pdf");
    var fileBytes = System.IO.File.ReadAllBytes(filePath);

    return File(fileBytes, "application/pdf", "generated-sample.pdf");
}

🔹 JavaScript

<button onclick="downloadDynamicPdf()">Download PDF</button>

<script>
function downloadDynamicPdf() {
    fetch('/Home/DownloadDynamicPdf')
        .then(res => res.blob())
        .then(blob => {
            const url = URL.createObjectURL(blob);
            const a = document.createElement('a');
            a.href = url;
            a.download = 'generated-sample.pdf';
            document.body.appendChild(a);
            a.click();
            a.remove();
            URL.revokeObjectURL(url);
        });
}
</script>

✅ Best for on-demand file generation or secure server-side files.


✅ 3. Download Excel File from Server (e.g., Report)

🔹 Controller

[HttpGet]
public IActionResult DownloadExcel()
{
    var path = Path.Combine(Directory.GetCurrentDirectory(), "wwwroot/files/report.xlsx");
    var fileBytes = System.IO.File.ReadAllBytes(path);

    return File(fileBytes, "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet", "report.xlsx");
}

🔹 JavaScript

<button onclick="downloadExcel()">Download Excel</button>

<script>
function downloadExcel() {
    fetch('/Home/DownloadExcel')
        .then(res => res.blob())
        .then(blob => {
            const url = URL.createObjectURL(blob);
            const a = document.createElement('a');
            a.href = url;
            a.download = 'report.xlsx';
            document.body.appendChild(a);
            a.click();
            a.remove();
            URL.revokeObjectURL(url);
        });
}
</script>

✅ Works for any binary file — Excel, Word, ZIP, images, etc.


✅ 4. Download File After Form Submission (POST)

Useful when you're sending data (e.g., form, filters) to generate the file first.

🔹 Controller

[HttpPost]
public IActionResult GeneratePdf([FromBody] ReportRequest request)
{
    // Simulate PDF generation
    var filePath = Path.Combine(Directory.GetCurrentDirectory(), "wwwroot/files/sample.pdf");
    var fileBytes = System.IO.File.ReadAllBytes(filePath);

    return File(fileBytes, "application/pdf", "filtered-report.pdf");
}

public class ReportRequest
{
    public string Filter { get; set; }
}

🔹 JavaScript

<button onclick="downloadFilteredPdf()">Download Filtered PDF</button>

<script>
function downloadFilteredPdf() {
    fetch('/Home/GeneratePdf', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ filter: "LastMonth" })
    })
        .then(res => res.blob())
        .then(blob => {
            const url = URL.createObjectURL(blob);
            const a = document.createElement('a');
            a.href = url;
            a.download = 'filtered-report.pdf';
            document.body.appendChild(a);
            a.click();
            a.remove();
            URL.revokeObjectURL(url);
        });
}
</script>

✅ Ideal for dynamic downloads based on user input.


✅ 5. Handle Filename from Response Header

If your controller sets the filename dynamically:

🔹 Controller

[HttpGet]
public IActionResult DownloadWithFilename()
{
    var path = Path.Combine(Directory.GetCurrentDirectory(), "wwwroot/files/sample.pdf");
    var bytes = System.IO.File.ReadAllBytes(path);

    Response.Headers.Add("Content-Disposition", "attachment; filename=\"custom-name.pdf\"");
    return File(bytes, "application/pdf");
}

🔹 JavaScript with filename parsing (advanced)

function downloadWithHeaderName() {
    fetch('/Home/DownloadWithFilename')
        .then(response => {
            const contentDisposition = response.headers.get('Content-Disposition');
            let fileName = "download.pdf";
            if (contentDisposition) {
                const match = contentDisposition.match(/filename="(.+)"/);
                if (match && match.length > 1) fileName = match[1];
            }
            return response.blob().then(blob => ({ blob, fileName }));
        })
        .then(({ blob, fileName }) => {
            const url = URL.createObjectURL(blob);
            const a = document.createElement('a');
            a.href = url;
            a.download = fileName;
            a.click();
            URL.revokeObjectURL(url);
        });
}

✅ Professional way if filename changes on server side.


🛡️ Tips for Production

  • Add error handling: .catch() in JS and try/catch in controller.

  • Ensure content types are correct (e.g., "application/pdf", "application/zip", etc.).

  • Use [Authorize] on secure download actions.

  • Don't allow downloading files directly from sensitive paths.
