
Baja Consulting is now live on SharePoint 2013 on Office 365. After waiting months for our old tenant to be upgraded, we decided to simply cancel the current subscription and sign up again. Because of that, everything is brand new (including this blog), so we apologize to anyone who got a bad link from a search engine.

For anyone else who chooses this approach, the only major issue we ran into was Lync federation. Other Office 365 tenants federated immediately (or pretty close), customer federation started working after about 12 hours, and Skype and MS Messenger contacts started working after about 36 hours.

Cheers!


Site Tools


Provides designers and developers the ability to review the current web site and the lists on that site, along with their properties. Also provides the ability to build queries against those lists and generate the JavaScript for selecting that information using either CSOM / CAML or REST / OData.

Open in Store »

 

Version 1.0.1.1 - 9/20/2013

What's New:

  • Changed the List Query Builder by adding a dropdown to select the type of generated code and removing the corresponding buttons.
  • Added the ability to generate Knockout code from the List Query Builder. Designed as a Data View replacement to allow designers to easily select data from lists and simply modify the generated HTML to render views.

[Screenshot: main page]

 

List Query Builder

 
Allows the user to view and build queries against the current web's lists, then optionally generate the JavaScript needed to retrieve the data using either CSOM or REST. The generated code can then be copied into a Script Editor Web Part or Content Editor Web Part to give new developers and site designers an easy jump start on the new SharePoint development methodology.
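To give a feel for it, the REST flavor of that retrieval code typically boils down to an OData call like the sketch below. The list title "Tasks", the field names, and the helper function names are illustrative assumptions, not the tool's actual output:

```javascript
// Hypothetical sketch of the kind of REST / OData retrieval code the
// List Query Builder generates; all names here are illustrative.
function buildListItemsUrl(webUrl, listTitle, selectFields) {
  var url = webUrl.replace(/\/$/, "") +
    "/_api/web/lists/getbytitle('" + listTitle + "')/items";
  if (selectFields && selectFields.length) {
    url += "?$select=" + selectFields.join(",");
  }
  return url;
}

function getListItems(webUrl, listTitle, selectFields, onSuccess) {
  var xhr = new XMLHttpRequest();
  xhr.open("GET", buildListItemsUrl(webUrl, listTitle, selectFields));
  // odata=verbose wraps the results in a "d" property
  xhr.setRequestHeader("Accept", "application/json;odata=verbose");
  xhr.onload = function () {
    onSuccess(JSON.parse(xhr.responseText).d.results);
  };
  xhr.send();
}
```

Dropped into a Script Editor Web Part, a call like `getListItems(_spPageContextInfo.webAbsoluteUrl, "Tasks", ["Title", "Id"], render)` is all a designer needs to start pulling list data.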

 

[Screenshot: List Query Builder]

 

Web Properties

 
Displays all of the available properties that are returned from the REST / OData call for the current web.
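A rough sketch of how a page like this might flatten the `/_api/web` response into name/value rows for display. The payload shape follows the odata=verbose convention (results under a `d` property, deferred navigation properties as nested objects); the helper name and sample values are assumptions:

```javascript
// Hypothetical helper: turn the odata=verbose response for /_api/web into
// simple name/value rows, skipping nested objects (deferred navigation
// properties such as Lists come back as objects, not scalar values).
function webPropertiesToRows(response) {
  var props = response.d;
  return Object.keys(props)
    .filter(function (k) { return typeof props[k] !== "object"; })
    .map(function (k) { return { name: k, value: props[k] }; });
}
```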

 

[Screenshot: Web Properties page]

 

Known Issues

  • The pages in the application use the SharePoint Client Object Model and the OData interface to dynamically build the information.  Some of the fields and pages that are dynamically called either return errors or return nothing.  These fields and pages exist but require alternative access methods. Currently, this is by design.

 

Browser Compatibility

The app requires IE9 or greater and has been tested on multiple other modern browsers. The generated code has also been tested on several older AJAX-enabled browsers, but you should test it yourself to ensure it meets all of your requirements.

Privacy Policy

 
This application reads web and list properties but does not transmit anything to any outside site. All generated code should be reviewed before placing it on your site. No guarantees of any kind are made regarding the privacy of the data.


During one of the service packs for SharePoint 2007, the ability to run anonymous workflows was turned off, and to the best of my knowledge a good solution to the issue was never found (even for SharePoint 2010). I tried solutions such as http://wssguestaccount.codeplex.com/ and others like it but ran into constant issues with the entire user session being run as an impersonated user. Clients kept asking me for a solution, and after searching around with no luck I decided it was time to write a little code.

Enter the SharePoint 2010 "site level" event receiver. While workflows require a logged-in user, the SPItemEventReceiver is not quite as picky: its events run any time an item is added or updated anywhere on our site (SPWeb). So how does this help, you might ask? Well, the basic idea is that when a new item is created or an existing item is modified, we can find the workflows associated with that item and launch them ourselves. Here is the code I use to do just that:


private void _checkWorkflows(SPListItem li, EventType et)
{
   SPSecurity.RunWithElevatedPrivileges(delegate()
   {
       SPUserToken token = null;

       using (SPSite site = new SPSite(li.ParentList.ParentWeb.Site.ID))
       {
           using (SPWeb web = site.RootWeb)
           {
               try
               {
                    string usr = web.AllProperties.ContainsKey(AnonWorkflowReceiverSettings.AnonWorkflowReceiverSettingsRunAs)
                        ? web.AllProperties[AnonWorkflowReceiverSettings.AnonWorkflowReceiverSettingsRunAs] as string
                        : "SHAREPOINT\\System";
                   token = web.GetUserToken(usr);
               }
               catch
               {
                   token = site.SystemAccount.UserToken;
               }
           }
       }

       using (SPSite site = new SPSite(li.ParentList.ParentWeb.Site.ID, token))
       {

           using (SPWeb web = site.OpenWeb(li.ParentList.ParentWeb.ID))
           {
               SPList list = web.Lists[li.ParentList.ID];
               SPListItem listItem = list.Items.GetItemById(li.ID);
               foreach (SPWorkflowAssociation wf in list.WorkflowAssociations)
               {
                   if ((et == EventType.Added && wf.AutoStartCreate) || (et == EventType.Updated && wf.AutoStartChange))
                   {
                       foreach (SPWorkflow runningWF in listItem.Workflows)
                       {
                           if (runningWF.Author == -1) // Kill all running WF that are anonymous since they will fail anyway
                           {
                               SPWorkflowManager.CancelWorkflow(runningWF);
                           }
                       }

                       site.WorkflowManager.StartWorkflow(listItem, wf, string.Empty, SPWorkflowRunOptions.Synchronous); // Works Async too
                   }
               }

           }
       }
   });
}

private enum EventType
{
   Added,
   Updated
}

You might notice that we have to run the workflow as a specific user; to do that, I store the user name in the root web's property bag. I default to the system account if no value is in the property bag, but I suggest not running your workflows that way. The rest is just setting up your event receiver and adding a way to assign the user the workflow should run as. For the receiver, I suggest you check out Creating SharePoint 2010 Event Receivers in Visual Studio 2010; for the user assignment, I added a page to the _layouts directory and added a custom link to it in Site Settings using a Custom Action.

 

So far the solution is working great for anonymous forms and other items that we allow users to submit on public web sites. As always, though, please test before using this code in production, or drop me a line if you need any consulting help.

 

I hope this helps someone.

 

Cheers,

James


Recently, while working with a client, a requirement for global navigation in their new SharePoint 2010 farm came up. They have approximately 10 web applications with a minimum of 10 site collections per web application, so manually maintaining the navigation at each site collection was out of the question. I therefore started to look at the usual suspects for global nav. I went over the SiteMapDataSource, the SharePoint 2010 navigation menu, and multiple other custom solutions with them. Each one had its benefits and drawbacks, but we had just about settled on a robust custom solution when an idea came to me while watching a St. Louis Cardinals baseball game (not important to the solution): what about the Managed Metadata Service? We were only going to have one service for the whole farm, and pretty much as long as the farm is up, so is the service (i.e., it is not dependent on another web application). So I started looking…

A Term has a label that can be used as the display name and a large description text box that can be used for the URL; the Term Store also has built-in custom sorting and, if needed, could support multiple languages in the future. Once I determined that, it was just a matter of some custom code and POC testing.

I started by setting up a Group, a Term Set, and the nested Terms that would make up my navigation: assigning each Term a "Default Label", populating the "Description" if I wanted it to be a link (or leaving it blank if it was just a container), and unchecking "Available for Tagging" for the Terms while keeping it checked for the Term Set (more on that later):

[Screenshot: Term Store setup]

 


Next enter Visual Studio:

 
The Microsoft.SharePoint.Taxonomy namespace / assembly has the classes we need to pull our Managed Metadata and is fairly easy to use.  We start by getting the Terms we are looking for from the service:

TaxonomySession session = new TaxonomySession(SPContext.Current.Site);

var termStore = session.DefaultSiteCollectionTermStore;
var tsc = termStore.GetTermSets("GlobalNavigation", 1033);

if (tsc.Count > 0)
{
     var tsGlobal = tsc[0];
     var tc = tsGlobal.GetAllTerms();
           
     var items = _buildMenu(tc, Guid.Empty,
                    tsGlobal.CustomSortOrder == null ? new List<string>() : tsGlobal.CustomSortOrder.Split(':').ToList());

     _renderMenu(items);
}


 
Notice I did have to hardcode the name of my Term Set, "GlobalNavigation", but everything else is pretty generic. I am assuming one Term Set in the entire service named GlobalNavigation, but you could also narrow by the Group first to make sure. Also notice the strange parameter in the call to my _buildMenu function. When custom sorting is used, the items are not returned sorted, and there is no property on a Term with its sort order; instead, the parent container (the Term Set in this case) has a property called CustomSortOrder that is a string of GUIDs separated by colons. Therefore I split the string into a correctly sorted list for use in the function.

 
To build the structure of my navigation I created a private class to maintain the information:

 

private class _menu
{
    public Term item {get; set;}
    public List<_menu> children { get; set; }
    public int order {get; set;}

    public _menu(Term i)
    {
        item = i;
        order = 0;
    }
}

 
Then it was just a matter of parsing the terms and putting them in the right order:

 

private List<_menu> _buildMenu(TermCollection terms, Guid parent, List<string> sortOrder)
{
    var ret = new List<_menu>();

    IEnumerable<Term> childTerms = null;

    if (parent != Guid.Empty)
    {
        childTerms = from k in terms
               where k.Parent != null && k.Parent.Id == parent && !k.IsDeprecated
               orderby k.CustomSortOrder
               select k;
    }
    else
    {
        childTerms = from k in terms
               where k.Parent == null && !k.IsDeprecated
               orderby k.CustomSortOrder
               select k;
    }

    foreach (Term child in childTerms)
    {
        var newItem = new _menu(child);
        if (sortOrder != null && sortOrder.Count > 0)
            newItem.order = sortOrder.IndexOf(child.Id.ToString());

        // Find this item's sub-terms - recursion rocks!
        newItem.children = _buildMenu(terms, child.Id,
            newItem.item.CustomSortOrder == null ? new List<string>() : newItem.item.CustomSortOrder.Split(':').ToList());

        ret.Add(newItem);
    }
   
    return (from r in ret
                orderby r.order, r.item.Name
                select r).ToList<_menu>();
}

 
Once you have the List of _menu items, you just have to use your favorite menu rendering technique, apply a little CSS, add your new control to your master page solution (or hack it in with SPD), and you have a menu based on your Term Set:

[Screenshot: rendered menu]


Advantages:

  • "Term Store Management" is available to contributors in each site collection as long as they can get to Site Settings
  • The Term Store is available across the entire farm without being dependent on a web application
  • Seems very fast (so far), but add caching to your control just to be sure

Quirks / Disadvantages:

  • You must make the Term Set "Available for Tagging" so everyone can see it. This might confuse people if they go to use Terms in the normal way and see our Term Set. However, since we unchecked the tagging option for each Term, they really can't do anything with it.
  • To get the custom sorting to work properly, every time you add a new Term you must change the sort order, save it, and then change it back to get the CustomSortOrder to populate correctly.
  • This does not security trim, but that could be added.
  • Requires custom code

Summary:
While I only proved this out this week and have not implemented it in production yet, I have tested it on multiple web applications and site collections, including anonymous ones, and so far so good. The client flipped over the maintainability and "fool proof" nature of this solution, but I am still searching for holes. If you have any questions or comments, especially about any inherent flaw, please let me know.

As always, I am a consultant available to help implement / customize the full solution. However, if enough people find it useful, I would be happy to create a CodePlex solution. This information is for discussion and education purposes only and is in no way guaranteed.

 
Cheers,
James


I know I promised this post a long time ago but thankfully I have been very busy.

[Screenshot: img1]

On the surface, the HTML web part available on the public-facing website appears to be a Swiss army knife for the site: if you can insert any HTML into the page, then you can do just about anything. But on closer examination you find that the HTML you insert into the part is placed in an iframe, and all script and content is loaded with the setTimeout JavaScript function. Currently it does appear that the base theme CSS is inserted into the iframe, but any custom CSS is not applied. This is easily worked around if you either copy and paste the same CSS or use the same import statement I discussed in my previous post.

The reason I started looking at the HTML part was to insert jQuery and my own script into the page. Being asynchronously loaded into an iframe meant it was no real help when it came to modifying elements on my page, so I started searching elsewhere. I then stumbled on Site Options on the Site ribbon in SPD.

 

To Add JQuery:

  • Open SPD -> Site ribbon -> Click Site Options
  • I chose to edit wh_footertext

[Screenshot: Site Options in SPD]

  • Added:
    <script src="http://www.google.com/jsapi" type="text/javascript"></script>
    <script src="/Site%20Assets/startup.js" type="text/javascript"></script>

 

These values are really there for adding consistent information throughout the site, but they work very well for including anything, as the content does not appear to be validated in any way.
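For completeness, here is a hypothetical sketch of what the referenced startup.js might contain, assuming the Google AJAX Libraries loader pulled in by the first script tag. The selector, text, and function name are placeholders, not part of any actual site:

```javascript
// Hypothetical /Site Assets/startup.js: use the Google AJAX Libraries
// loader (www.google.com/jsapi, included above) to pull in jQuery, then
// apply page tweaks once the DOM is ready.
function applyPageTweaks($) {
  // Illustrative only: the selector and text are assumptions.
  $("#headerText").text("Welcome to our site");
}

if (typeof google !== "undefined" && google.load) {
  google.load("jquery", "1.7");
  google.setOnLoadCallback(function () {
    jQuery(function () {
      applyPageTweaks(jQuery);
    });
  });
}
```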