# Sunday, July 12, 2015

If your ASP.NET 5 (MVC 6) global view imports suddenly stopped working after a beta upgrade, it is due to the renaming of GlobalImport.cshtml to _ViewImports.cshtml.

You may rename the GlobalImport.cshtml file under the Views folder to _ViewImports.cshtml.

You may refer to this announcement for the details: https://github.com/aspnet/Mvc/issues/2489
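After the rename, the file's contents work as before. For reference, a typical _ViewImports.cshtml from that timeframe might look like this (the namespace is a placeholder, and the quoted @addTagHelper form is the beta-era syntax):

```cshtml
@using MyApp.Models
@addTagHelper "*, Microsoft.AspNet.Mvc.TagHelpers"
```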

Sunday, July 12, 2015 5:52:56 PM UTC
# Tuesday, June 16, 2015

It looks like we have resolved another configuration-related issue on Azure. In case someone is facing the same problem without a clue how to fix it, read the following.

Our issue was: an ASP.NET web site that ran fast on an on-premises server became noticeably slower when migrated to an Azure VM. The same VM also hosted the SQL Server instance for this web site.

1) Presently, each data disk attached to a VM has a 500 IOPS limit.

2) It is likely that installing SQL Server and actively using the instance will hit this limit; Azure calls it throttling of IOPS. If you have set up monitoring as suggested in one of the replies above, you may be able to verify that. In our experience, even light to moderate SQL Server usage (with only one data disk and the VM not configured properly) is enough to trigger IOPS throttling, so it has to be taken seriously.

To fix this issue you need to increase the available IOPS. That can be achieved by adding more data disks to your virtual machine; each disk comes with 500 IOPS capacity.

3) Additionally, the VM has to be configured correctly for SQL Server to avoid these types of bottlenecks; merely adding disks will not solve it. You may do the following:

a) Add the maximum number of data disks, each 1 TB in size (as per current offerings, each standard A2 VM can have 4 data disks of 1 TB capacity). Remember that Azure charges only for the portion of the disks you actually use, even though you have attached all of them.

b) Stripe all 4 disks together into a new storage pool. The pool can be created from Server Manager, under the Files and Storage Services tab. However, you cannot really use this GUI to configure it further; at the moment there is no GUI for that, so you will have to use PowerShell. Configuration involves specifying correct values for the columns and interleave properties: you may specify 4 columns and a 64 KB interleave (see the PowerShell sketch after this list).

c) Once this is done, you may create the required virtual disks on this storage pool. You may specify an allocation unit size of 64 KB while formatting the drive used for the SQL Server data directory.

4) Besides this, you may also consider storing TempDB on the D: drive of the Azure VM. There is no IOPS cap on the D: drive. However, it is a temporary drive; each time the VM is rebooted, it gets cleared.
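Here is a minimal PowerShell sketch of steps b) and c). The pool and disk names are placeholders of mine, so verify the output of Get-PhysicalDisk and adjust before running anything on your own VM:

```powershell
# List the attached data disks that are eligible for pooling
Get-PhysicalDisk -CanPool $true

# Create a storage pool from all poolable disks ("SqlPool" is a placeholder name)
New-StoragePool -FriendlyName "SqlPool" `
    -StorageSubSystemFriendlyName (Get-StorageSubSystem).FriendlyName `
    -PhysicalDisks (Get-PhysicalDisk -CanPool $true)

# Create a simple (striped) virtual disk: 4 columns, 64 KB interleave
New-VirtualDisk -StoragePoolFriendlyName "SqlPool" -FriendlyName "SqlData" `
    -ResiliencySettingName Simple -NumberOfColumns 4 -Interleave 65536 `
    -UseMaximumSize

# Initialize, partition, and format with a 64 KB allocation unit for the SQL data drive
Get-VirtualDisk -FriendlyName "SqlData" | Get-Disk |
    Initialize-Disk -PartitionStyle GPT -PassThru |
    New-Partition -AssignDriveLetter -UseMaximumSize |
    Format-Volume -FileSystem NTFS -AllocationUnitSize 65536 -NewFileSystemLabel "SqlData"
```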

If you have already set up your VM and are using it, then the easier solution (if possible) is to create another VM, configured as above, and de-allocate the existing one once you have migrated everything.

If you do not want to de-allocate the existing VM and you don't have any data disks attached, then it can be done by adding data disks and configuring storage pools. However, if you have already attached data disks and are using them, then it is hard; I really do not know how to do it. I think storage pools cannot be created from data disks that are already in use as physical drives.

In my case I set up an entirely new VM and de-allocated the existing one after the migration. Presently, the site is running way faster, the way it should on the cloud.

Note: why did the same site behave correctly when initially migrated to Azure and only run into issues afterwards? (It took around 3 to 4 weeks before the site ran into issues.)

As per the support person, it is possible that when we configured our VM initially, the rack may not have hosted anyone except us, so Azure let our IOPS exceed the limit and did not care much. As more people allocated resources on the same rack, Azure started balancing the activity to give fair usage to others too. Exceeding the limit was no longer tolerated, and penalties were imposed in terms of performance.

Tuesday, June 16, 2015 8:44:47 PM UTC
# Thursday, May 14, 2015

Recently I was working on a CRM Dynamics migration where the system was upgraded to the latest version of CRM Dynamics Online. CRM would be using claims-based authentication, since I was authenticating against my Office 365 account.

I had the latest 2015 SDK and was trying to use the early-bound approach to integrate in-house applications with CRM Dynamics Online.

Things were okay until I found that the application was throwing an error while saving the context. Ironically, the application was able to generate the service proxy with the provided id and password; it failed only when I tried saving the context.

I started looking around in the SDK, where many samples are given. I figured out that unless I specified correct device credentials, the system kept rejecting the request with the authentication error mentioned above.

Referring to the SDK, I put in the following code to generate the proxy:

```csharp
IServiceManagement<IOrganizationService> orgServiceManagement =
    ServiceConfigurationFactory.CreateManagement<IOrganizationService>(
        crmConnection.ServiceUri);

var clientCredentials = crmConnection.ClientCredentials;

var authCredentials = new AuthenticationCredentials();
authCredentials.ClientCredentials = clientCredentials;

// The crucial part: supply device credentials alongside the user credentials
authCredentials.SupportingCredentials = new AuthenticationCredentials();
authCredentials.SupportingCredentials.ClientCredentials =
    Microsoft.Crm.Services.Utility.DeviceIdManager.LoadOrRegisterDevice();

AuthenticationCredentials tokenCredentials =
    orgServiceManagement.Authenticate(authCredentials);

var organizationTokenResponse = tokenCredentials.SecurityTokenResponse;

return proxy = new OrganizationServiceProxy(orgServiceManagement, organizationTokenResponse);
```

It should be noted that this code should use the connection string specified on the "Developer Resources" tab, available when you log in to your CRM account.
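For illustration, a CrmConnection (from Microsoft.Xrm.Client) can be built from such a connection string; the values below are placeholders, so substitute the string shown on your own Developer Resources page:

```csharp
// Placeholder values; copy the real connection string from the
// Developer Resources page of your CRM Online organization.
var crmConnection = CrmConnection.Parse(
    "Url=https://yourorg.crm.dynamics.com; " +
    "Username=user@yourorg.onmicrosoft.com; Password=yourPassword;");
```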

If you have set up something similar to the above and are facing similar issues, then you should try using device credentials.

Thursday, May 14, 2015 9:23:34 PM UTC
# Sunday, April 13, 2014

In case you are not already aware, it is good to know that SAP Crystal Reports now supports Visual Studio 2013. You may download it from here:

http://scn.sap.com/docs/DOC-7824 

Please make sure that you run the EXE to integrate it with VS. Running the MSI will not do it: the MSI installs the runtime but does not integrate it with VS.

Sunday, April 13, 2014 6:22:28 PM UTC
# Thursday, November 15, 2012

I am putting together a code snippet which you can use to create a shared method for any controller in ASP.NET MVC. The method locates a view, binds a model to it, and sends back a byte stream with the application/pdf content type. As all modern browsers understand this content type, they can read the byte stream and show the PDF in the browser window.

Add the following method to your base controller, and inherit your other controllers from this base controller.

```csharp
public PDFContentResult RenderPDFUsingHTML(string viewName, object model)
{
    StringWriter sw = new StringWriter();
    RazorViewEngine rv = new RazorViewEngine();
    ViewEngineResult vr = rv.FindPartialView(this.ControllerContext, viewName, false);

    // Bind the model and render the view into the StringWriter
    ViewData.Model = model;
    ViewContext vc = new ViewContext(this.ControllerContext, vr.View, ViewData, TempData, sw);
    vr.View.Render(vc, sw);
    string s = sw.GetStringBuilder().ToString();

    /* I am using iTextSharp to convert the raw HTML into a PDF. The following
       GetHTMLToPDFBytes(s) is the iTextSharp implementation. You can also use
       any other utility to convert an HTML string into a PDF. */
    byte[] pdfBytes = GetHTMLToPDFBytes(s);

    /* PDFContentResult is a custom ActionResult type. See the following code
       snippet for the details. */
    return new PDFContentResult(pdfBytes, "application/pdf");
}
```
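The post does not spell out GetHTMLToPDFBytes. A minimal sketch using iTextSharp's classic HTMLWorker could look like the following; HTMLWorker only copes with fairly simple HTML, so treat this as a starting point rather than the exact implementation:

```csharp
// A minimal sketch, assuming the classic iTextSharp HTMLWorker API
private byte[] GetHTMLToPDFBytes(string html)
{
    using (var ms = new MemoryStream())
    {
        var document = new iTextSharp.text.Document();
        iTextSharp.text.pdf.PdfWriter.GetInstance(document, ms);
        document.Open();

        using (var reader = new StringReader(html))
        {
            // HTMLWorker parses simple HTML and writes it into the PDF document
            var worker = new iTextSharp.text.html.simpleparser.HTMLWorker(document);
            worker.Parse(reader);
        }

        document.Close();
        return ms.ToArray();
    }
}
```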

```csharp
public class PDFContentResult : ActionResult
{
    private readonly string contentType;
    private readonly byte[] contentBytes;

    public PDFContentResult(byte[] contentBytes, string contentType)
    {
        this.contentBytes = contentBytes;
        this.contentType = contentType;
    }

    public override void ExecuteResult(ControllerContext context)
    {
        var response = context.HttpContext.Response;
        response.Clear();
        response.Cache.SetCacheability(HttpCacheability.Public);
        response.ContentType = this.contentType;

        // Write the PDF bytes directly to the response stream
        using (var stream = new MemoryStream(this.contentBytes))
        {
            stream.WriteTo(response.OutputStream);
            stream.Flush();
        }
    }
}
```

The code snippet is fairly self-explanatory. RenderPDFUsingHTML takes the name of a view and a model as its arguments. It tries to find the required Razor view based on the given view name; once found, it binds the model data to it. It then renders the view to a raw HTML string and sends that string to iTextSharp to build the PDF byte array.

To keep things simple it returns a PDFContentResult, which is a custom ActionResult type responsible for creating the required HTTP response with the byte array written to it.

You do the following in your controller to get the PDF. Make sure that your controller inherits from the base controller where you put the above code snippet.

```csharp
public ActionResult GetMyPDF(MyModel model)
{
    // ... controller logic ...
    return RenderPDFUsingHTML("MyViewName", model);
}
```

It is good practice to create another view that is a simplified version of the original one. The views we use on web sites are generally a little complicated, since they hold HTML controls such as buttons and dropdown menus, JavaScript, and so on, which we really do not want to show in PDF reports. So the simplified view takes the same model but leaves out the unneeded JavaScript and HTML controls.

Thursday, November 15, 2012 6:07:46 PM UTC
# Wednesday, October 31, 2012

Today I wondered why the values inserted by the jQuery date picker were not properly parsed by the default ASP.NET MVC model binder. I discovered that the date picker was following the US format (mm/dd/yyyy), while the model binder was following the UK format (dd/mm/yyyy).

My first thought was to override the model binder for the DateTime type. Then I thought that might be too much work and there had to be an easier way. I started looking into the ASP.NET MVC web.config structure and found that I could specify the current culture.

I added the following <globalization> element inside <system.web>:

```xml
<globalization
    requestEncoding="utf-8"
    responseEncoding="utf-8"
    culture="en-US"
    uiCulture="en-US" />
```

It solved my problem: the model binder was able to parse the date picker's values, and the model was properly filled with the required DateTime values.
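For comparison, the model-binder override I decided against would have looked roughly like this (a sketch of mine, assuming every posted date is en-US; the class name is hypothetical):

```csharp
using System;
using System.Globalization;
using System.Web.Mvc;

// Hypothetical binder that always parses DateTime values as en-US (mm/dd/yyyy)
public class UsDateTimeModelBinder : IModelBinder
{
    public object BindModel(ControllerContext controllerContext, ModelBindingContext bindingContext)
    {
        var value = bindingContext.ValueProvider.GetValue(bindingContext.ModelName);
        if (value == null || String.IsNullOrWhiteSpace(value.AttemptedValue))
            return null;

        return DateTime.Parse(value.AttemptedValue, CultureInfo.GetCultureInfo("en-US"));
    }
}

// Registered once in Application_Start:
// ModelBinders.Binders.Add(typeof(DateTime), new UsDateTimeModelBinder());
```

The one-line web.config change achieves the same result with far less code, which is why I went that way.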

Wednesday, October 31, 2012 10:44:23 PM UTC
# Wednesday, September 28, 2011

A nice article regarding HttpContext. It is very useful if you want to know where to use the Application_BeginRequest event available in the Global.asax file:

```csharp
protected void Application_BeginRequest(Object sender, EventArgs e)
```
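For instance (a hypothetical sketch of mine, not from the article), the event fires once per incoming request before any handler runs, which makes it a natural place for cross-cutting work such as request logging:

```csharp
// Hypothetical example: log every incoming request URL
protected void Application_BeginRequest(Object sender, EventArgs e)
{
    var app = (HttpApplication)sender;
    System.Diagnostics.Trace.WriteLine(app.Context.Request.RawUrl);
}
```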

Refer to this link: http://odetocode.com/articles/111.aspx

Happy Coding!

Wednesday, September 28, 2011 2:31:47 PM UTC
# Saturday, September 24, 2011

If you ever wondered about VIEWSTATE, especially in terms of its security, then this may be a good article:

http://aspnetresources.com/articles/ViewState

Please have a look. I'm sure there is a lot to learn from here.

Saturday, September 24, 2011 2:16:23 PM UTC
# Tuesday, August 23, 2011

I converted my web application from .NET 3.5 to .NET 4.0 and ported it to IIS 7.0. All of a sudden the AutoComplete Extender stopped working.

After some research I found that my web.config file needed the following:

```xml
<system.webServer>
  <handlers>
    <add name="ScriptHandlerFactory" verb="*" path="*.asmx" preCondition="integratedMode"
         type="System.Web.Script.Services.ScriptHandlerFactory, System.Web.Extensions, Version=4.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />
    <add name="ScriptResource" verb="GET,HEAD" path="ScriptResource.axd"
         type="System.Web.Handlers.ScriptResourceHandler, System.Web.Extensions, Version=4.0.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" />
  </handlers>
</system.webServer>
```

I added these handlers and the AutoComplete extender was back at work.

Hope this helps others solve similar problems.

Happy coding to everyone !!

Tuesday, August 23, 2011 7:00:46 AM UTC
# Monday, August 31, 2009

Many times we need to refresh/reload a parent window in response to events on a child window. For example:

1. The parent web page shows values from the database.

2. An end user presses a button on the parent window. It pops up a new child window, where the end user enters new values for database records.

3. The end user presses the submit button on the child window. It runs the code-behind, updates the database, and closes the child window after showing a relevant message.

In this situation it is a good idea to refresh the parent web page as soon as the child page is closed. This way the data on the parent web page gets refreshed and the end user gets confirmation of the recent changes.

The following small JavaScript function will help you achieve that.

```html
<script type="text/javascript">
    function refreshAPage() {
        // Reload the parent (opener) window at its current URL
        window.opener.location.href = window.opener.location.href;
        // Or send the opener to a specific page instead:
        // window.opener.location.href = "URL";
    }
</script>
```
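On the child page you would then call this function after a successful save, just before closing the window. A hypothetical example (such a snippet is typically emitted by the code-behind once the update succeeds):

```html
<script type="text/javascript">
    // Hypothetical: after the save succeeds, refresh the opener and close the popup
    refreshAPage();
    window.close();
</script>
```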

Monday, August 31, 2009 11:14:59 PM UTC
Copyright © 2022 Mayur Bharodia. All rights reserved.