About SharePoint with 16+ Cores

Well, if you update to .NET Framework 4.7.2 or higher it is OK; otherwise it is a bad idea.

More specifically,
ReaderWriterLockSlim used with reentrancy (lock recursion) has a design limitation that can cause serious performance degradation on earlier .NET Framework versions, and SharePoint depends heavily on this thread synchronization object. In particular, the Blob Cache and the ObjectCache wrap and use it. More CPU cores cause more thread contention, and that excessive locking brings the slowness.

Example call stacks:
SPReaderWriterLock named [BlobCache] waited 43992 milliseconds to acquire lock. Call stack:
at Microsoft.Office.Server.Utilities.SPReaderWriterLock.AcquireLock(Boolean readerLock, Boolean upgradable, Boolean throwException)
at Microsoft.Office.Server.Utilities.SPReaderWriterLock.AcquireLock(Boolean readerLock, Boolean upgradable)

System_Core_ni!System.Threading.ReaderWriterLockSlim.EnterMyLockSpin()
System_Core_ni!System.Threading.ReaderWriterLockSlim.TryEnterWriteLock(Int32)
Microsoft_Office_Server!Microsoft.Office.Server.Utilities.SPReaderWriterLock.AcquireLock(Boolean, Boolean)
Microsoft_Office_Server!Microsoft.Office.Server.ObjectCache.SPCache+MossObjectCache.UpdateUsageMap(System.String, UInt32, UInt32)
Here are some threads about the problem:
https://github.com/dotnet/coreclr/pull/13243
https://github.com/dotnet/coreclr/pull/13495

These issues have been fixed in .NET Framework 4.7.2!

Please check the .NET Framework 4.7.2 release notes:
https://github.com/Microsoft/dotnet/blob/master/releases/net472/dotnet472-changes.md

Again, do not assume that 64 CPUs bring more performance. It depends on the software's boundaries/limitations and several other things.
My suggestion: scale out with multiple machines instead of excessive hardware. It is cheaper and more stable. (8 cores are fine 🙂)

If you already have those monster machines, don't worry: you can still use them with Hyper-V and scale out across VMs.


Microsoft changes .NET Framework Support Lifecycle Policy

On August 7, 2014, Microsoft announced that support will end for .NET Framework 4, 4.5, and 4.5.1 on January 12, 2016. We recommend customers and developers complete the in-place update to .NET Framework 4.5.2 by January 12, 2016 to continue receiving technical support and security updates. .NET Framework 4.5.2, as well as all other .NET Framework versions such as 3.5 SP1, will continue to be supported for the duration of the operating system support lifecycle.

http://support.microsoft.com/gp/Framework_FAQ

Prevent caching for specific files in SharePoint

Assume the following scenario: you have a SharePoint environment and all caching is enabled. Caching is very efficient for documents that do not change frequently. But what if you have a dynamically and frequently changing XML, JS, TXT, or CSS file? How can you make only the needed files bypass the cache while other files of the same type stay cached?

There is an ASP.NET trick for this purpose. Assume we have a menu.xml file that is updated by some operation on the server, and this operation happens very frequently, but your clients cannot get the updated file until they completely clear their browser caches. You do not want to prevent every XML file from being cached.
Caching mechanisms basically decide whether to serve a file from cache by checking the URL. If you change the URL somehow, you can bypass the cache. We usually do this by adding a fake version query string like menu.xml?ver=1 at the end of the URL and increasing this parameter whenever we change the file. You can also use GUIDs for this, as in the sketch below.
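As a minimal sketch of the manual versioning variant (assuming the file lives at /_layouts/menuxml/menu.xml and is loaded with jQuery, as in the rest of this post), bumping the ver parameter whenever the file changes is enough to defeat the cache:

// A hand-maintained version number: increase it every time menu.xml changes.
var menuXmlVersion = 2;

// The browser treats menu.xml?ver=1 and menu.xml?ver=2 as different resources,
// so raising the number forces a fresh download.
$.get('/_layouts/menuxml/menu.xml?ver=' + menuXmlVersion, function (data) {
    // do operations with the fresh copy of menu.xml.
});

The steps below use a GUID instead, so you do not have to maintain the number by hand.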

1) Create a test.js file and add the following code. You may also integrate it into your own page directly.

// Creates a block of four hexadecimal digits.
function CreateDigits() {
    return (((1 + Math.random()) * 0x10000) | 0).toString(16).substring(1);
}

// Combines the blocks into a GUID-like string.
function createGuid() {
    return (CreateDigits() + CreateDigits() + '-' + CreateDigits() + '-4' +
            CreateDigits().substr(0, 3) + '-' + CreateDigits() + '-' +
            CreateDigits() + CreateDigits() + CreateDigits()).toLowerCase();
}
// alert(createGuid());

// Request menu.xml with a fresh GUID so the browser never serves a cached copy.
$.get('/_layouts/menuxml/menu.xml' + '?ver=' + createGuid(), function (data) {
    // do operations.
    alert('Load was performed.');
});

2) Add test.js to your master page:
<script type="text/javascript" src="/_layouts/test.js"></script>

3) Use the IE developer tools (F12), the Firefox Firebug plugin, or the Fiddler2 program on your client to check that menu.xml is loading when you request the file.

You should see that the menu.xml request contains the ?ver=<GUID> notation.

4) Update your JavaScript files that load the XML file according to the example; a before/after sketch follows below.
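For instance, an existing loader call might change like this (renderMenu here is just a hypothetical callback name, not code from the original page):

// Before: the browser may keep returning a stale cached copy.
$.get('/_layouts/menuxml/menu.xml', renderMenu);

// After: every call produces a unique URL, so the cache is bypassed.
$.get('/_layouts/menuxml/menu.xml?ver=' + createGuid(), renderMenu);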

5) Change the XML file, test again, and check whether browsers still cache the file.

What do we basically do?

The code above creates a GUID with the CreateDigits() and createGuid() functions. While requesting the XML file, we add this GUID to the end of the file's URL as a parameter, and every request creates another GUID, like http://<site>/_layouts/menuxml/menu.xml?ver=bca7f319-5627-4d22-48f6-4c3e59285199

So every request from the client is unique, which forces clients to fetch the updated XML file every time. This is the basic logic; you can develop your own solution based on it, for example a small reusable helper like the one sketched below.
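As one possible generalization (the withCacheBuster name is only an illustration, not part of the original example), the trick can be wrapped in a small helper and reused for any frequently changing file:

// Appends a unique ver=<GUID> parameter to any URL so the browser cannot reuse a cached copy.
function withCacheBuster(url) {
    var separator = url.indexOf('?') === -1 ? '?' : '&';
    return url + separator + 'ver=' + createGuid();
}

// Usage: works the same way for XML, JS, TXT, or CSS files.
$.get(withCacheBuster('/_layouts/menuxml/menu.xml'), function (data) {
    // do operations with the fresh file.
});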

Resources:
http://guid.us/GUID/JavaScript