
AEM Architect Certification

Recently I took the AEM Architect certification, which really helped me explore AEM (especially its integration with other Adobe and non-Adobe products). More than the recognition, the certification will boost your confidence and knowledge, which you can use while designing solutions and working on AEM projects.

I started working on AEM (when it was called CQ, and 5.4 was the latest version) as a developer and delivered multiple projects in various domains. During my early days I was very focused on the core building blocks of AEM, e.g. OSGi, Sling, JCR, etc., to understand the APIs, and that really boosted my ability to work on AEM. After working on AEM for a few years I thought it was a good time to go for the AEM Architect certification, since my job role is AEM Architect anyway and I have learned a lot on the job.

In this post I want to share my experience with those who want to become an AEM architect. This exam covers every aspect of AEM, right from installation to designing custom solutions on top of AEM, so if you do not have a few years of AEM experience I highly recommend spending at least 2 to 3 years with AEM first.

When I started to think about certification I was quite sure that I had practical, hands-on knowledge, but I soon realized that is not enough for the architect certification. Along with hands-on knowledge you should know best practices, how to secure AEM, how to customize AEM, and what is provided out of the box; to be frank, you should know everything about AEM. The purpose of this exam is to make sure that an architect is able to design a solution using AEM's capabilities that satisfies the client's/customer's requirements. In this exam you'll not be asked to write code or remember syntax, but from a functional point of view you are expected to know all the capabilities of AEM.

So, what should you study, and where can you find information to get started?

As per Adobe's certification mandate, everyone needs to sign an NDA and no one is supposed to disclose the questions asked during the exam, so I won't disclose them here. What I can tell you in this article is which topics you should study and where to find study resources.

There are no shortcuts. For this exam you have to prepare well, on top of your real experience. If you have good hands-on experience with AEM but not much time to go through the documentation, then I recommend attending a training offered by Adobe (http://training.adobe.com/training/current-courses.html#solution=adobeExperienceManager&p=1&country=United-States). I did not attend any training because I was already very familiar with the documentation, as I referred to it frequently in my day-to-day work.

Two important points to remember while preparing for the certification (I did this):
1) While going through the documentation, remember that it is always beneficial to try out the things explained there yourself, on your local AEM instance. This will not only help you understand the concept, but you'll also remember it better because you did it yourself.
2) Things which are new to you, or which you do not use often, need more attention. Try to repeat them with various combinations to see what happens when you change something.

Here are some of the topics you should master before you appear for the exam. The exam guide also has a detailed list of topics that you can refer to.
1) Backup strategies
2) Sling: EventListener (JCR), EventHandler (AEM) & Schedulers
3) Query: JCR queries, QueryBuilder, predicates. When to use which option.
4) Sightly
5) David's Content Model
6) Architectural diagrams (conceptual, physical, data flow, logical, etc.): which diagram is used when, and which document you'll prepare based on the customer's request.
7) AEM Security Checklist: https://docs.adobe.com/docs/en/aem/6-0/administer/security/security-checklist.html
8) Out of the box features & components
9) Translation and internationalization (i18n) of AEM components
10) Authentication handler, login module, integration with LDAP, SAML, Development of custom identity providers
11) Security – SAML, LDAP, Custom login module, Authentication handler, 2 factor authentication, clickjacking, CSRF, XSS, firewall rules, DOS attack, ACL groups / users etc.
12) MSM: Blueprint, triggers, actions. Developing custom rollout configurations. Best practices for site hierarchy, live copy, blueprint, live action, roll out config, language copy, translation workflow, etc.
13) Baselining AEM infrastructure (machine capacity, disk space, etc.): how many author/publish instances are required based on the number of users, the size of the repository and the load on the server.
14) Integration with: Adobe Campaign, Adobe Target (mbox.js), Adobe Analytics/SiteCatalyst, Adobe Media Optimizer, Dynamic Tag Manager, PhoneGap
15) Importing content from external systems – explore various options such as the poll importer, eCommerce product importing, the POST servlet, content packages (best for large content, per the Adobe docs), live feed integration, etc.
16) Integration with external systems such as an eCommerce PIM, offer creation systems, etc.
17) Serving web pages created by external systems from AEM.
18) Dispatcher – dispatcher.any, permission sensitive caching, dispatcher cache invalidation, etc.
19) Dispatcher web server selection – IIS vs. Apache, and how to choose in the context of a phased migration to AEM.
20) Dispatcher (very important): you should be able to set up the dispatcher on IIS or Apache and configure it properly based on project requirements. You should know every tag/element and its purpose. You should also be aware of the security checklist that needs to be applied to the dispatcher.
21) Deployment options for author / publish and impact on parameters such as performance and failover – TarMK (farm), MongoMK (cluster).
22) Clientlibs and how they help improve load time (performance).
23) Performance of author machines – concurrent workflows, limiting parallel jobs, disabling the asset synchronization service, etc.
24) Identifying the number of templates needed for a given set of pages.
25) AEM forms, its deployment model and security
26) Multiple languages for dialog – Language nodes, translator, etc.
27) AEM translation framework.
28) Account management activities – Password reset, etc.
29) Tagging – Best way to model tag structure.
30) AEM replication. How replication works in various deployment models.
31) WCM Components – Geo location support, OOTB components, Column component, extending components, etc.
32) Ecommerce – product data importing, price information, ecommerce API, etc.
33) UGC / Communities – Moderation, forum support, messaging support, adobe social, social logins, etc.
34) Caching, CDN.

Some Useful links:
· OOTB Components: https://docs.adobe.com/docs/en/aem/6-1/author/page-authoring/default-components/components.html
· Best Practice: https://docs.adobe.com/docs/en/aem/6-0/develop/the-basics/dev-guidelines-bestpractices.html
· OSGi Configs: https://docs.adobe.com/docs/en/cq/5-6-1/deploying/osgi_configuration_settings.html
· https://docs.adobe.com/docs/en/aem/6-1/develop/components/i18n/translator.html
· http://docs.adobe.com/docs/en/aem/6-0/administer/sites/languages.html
· https://docs.adobe.com/docs/en/aem/6-0/develop/components/i18n.html
· http://docs.adobe.com/docs/en/cq/5-5/wcm/default_components.html#Column%20Control
· https://docs.adobe.com/docs/en/cq/5-6-1/deploying/performance.html
· Vanity Path: http://antonyh.co.uk/category/adobe-cq5/
· AEM Form: https://docs.adobe.com/docs/en/aem/6-1/author/page-authoring/default-components/components.html
· https://docs.adobe.com/docs/en/aem/6-0/develop/best-practices.html
· https://helpx.adobe.com/aem-forms/6/aem-forms-architecture-deployment.html
· http://docs.adobe.com/docs/en/aem/6-0/administer/sites/multi-site-manager/msm-sync.html#Synchronisation%20Actions
· http://blogs.adobe.com/experiencedelivers/
· cURL commands: https://gist.github.com/sergeimuller/2916697

I hope this will help you to learn more about AEM and prepare for your certification. All the best!!!

AEM, FORM Submission & Handling POST requests

In this article I'll be talking specifically about how to handle POST requests in AEM and what options we have.

Please note that Adobe has added more security around handling POST requests from AEM 6.x onwards to prevent CSRF attacks. If you don't configure AEM correctly, POST requests will not work even after using any of the methods explained in this article. You can read more about the security configurations and restrictions at:

It is a very common use case to have an external application POST some data to an AEM page (e.g. /content/en/home.html), or even submit a form (POST) to the same URL. But, as you know, POST works differently in AEM: any POST call to AEM is intercepted by Sling's POST servlet (org.apache.sling.servlets.post.impl.SlingPostServlet).

In some cases org.apache.sling.servlets.post.impl.SlingPostServlet is very useful, namely when you actually want to perform CRUD (create, read, update and delete) operations on JCR nodes. If you are interested in manipulating nodes in AEM and don't want to deal with low-level JCR sessions, access rights, etc., then look at some simple but very powerful examples at the below URL:


Let's go back to our original topic and define a use case for this article. Let's say we have a page /content/en/home.html and the following are the acceptance criteria:

1. A GET request should render the original page as HTML (/content/en/home.html).
2. A POST request to the same page should also render the same page and should preserve the original request object (request parameters, etc.).
3. An external application should be able to POST data to /content/en/home.html, and we should be able to access it via request.getParameter("WHATEVER_PARAM_NAME").

Out of the box, the first use case just works without any customization. But if you try to POST something to the same URL you'll not get the expected result, because Sling's POST servlet will intercept the request and assume we are trying to perform some CRUD operation, which is not what we want in this case.

Sling provides a very useful utility to check which script/servlet will be invoked and, if there are multiple eligible servlets that can handle the request, in what order they are consulted. On your author instance, open the Sling Servlet Resolver console, enter any URL, choose the request method (GET, POST, etc.) and click Resolve. The result shows a list of servlets and their order.


By default the order of POST request processing is as follows:

1. com.adobe.granite.rest.impl.servlet.DefaultServlet (OptingServlet)
2. org.apache.sling.servlets.post.impl.SlingPostServlet
3. org.apache.sling.jcr.webdav.impl.servlets.SlingWebDavServlet

So, what options do we have?

1. Create a Sling servlet
2. Add a POST.jsp at the template level and update the form action so that it points to jcr:content, e.g. /content/en/home/jcr:content.html. We need to add jcr:content to the URL because this is the node where sling:resourceType is defined, which tells Sling's resolver where our POST.jsp (capable of handling the POST request) is sitting. If we don't include jcr:content in the path, Sling's POST servlet will handle the request.
3. Just add the ".external" selector to your POST request URL (e.g. /content/en/home.external.html). Does this sound simple? Yes, it is.
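A quick sketch of where that selector sits in a Sling URL (the helper class below is mine, for illustration only; it is not an AEM API):

```java
public class SelectorUtil {
    /**
     * Inserts a Sling selector before the extension of a page URL,
     * e.g. /content/en/home.html -> /content/en/home.external.html
     */
    public static String addSelector(String url, String selector) {
        int dot = url.lastIndexOf('.');
        if (dot < 0) {
            // No extension present: just append the selector
            return url + "." + selector;
        }
        return url.substring(0, dot) + "." + selector + url.substring(dot);
    }
}
```

With the resulting URL as the form action, Sling resolves the request to a selector-specific script instead of the default POST servlet.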

I'll not explain the first 2 options, as you can find plenty of examples for them on the internet. I am going to talk about the power of the 3rd option in this article. So how does this work?

In AEM every page (.html) has the primary type jcr:primaryType = cq:Page, and there is a set of scripts mapped to this primary type which intercept requests and render a page differently based on factors like selectors, the request method (by default only GET is handled), etc. You can find these mapping scripts for cq:Page under the folder /libs/foundation/components/primary/cq/Page. One of the scripts under this folder is external.POST.jsp, and this is the script responsible for handling a POST request with the ".external" selector. If you look at the implementation of external.POST.jsp you'll notice that it is very simple: it creates a wrapper around the existing POST request, overriding the getMethod() method of the SlingHttpServletRequestWrapper class so that it returns GET, and forwards to the page URL (with the selector removed) using the wrapped request and response objects.
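To make the wrapper idea concrete, here is a minimal plain-Java sketch of the same pattern. The Request interface is a simplified stand-in for the servlet request, not the actual Sling API:

```java
// Simplified stand-in for the servlet request interface
interface Request {
    String getMethod();
}

// What external.POST.jsp effectively does: wrap the incoming request and
// override getMethod() so that downstream rendering scripts see a GET request
class GetMethodRequestWrapper implements Request {
    private final Request wrapped;

    GetMethodRequestWrapper(Request wrapped) {
        this.wrapped = wrapped;
    }

    @Override
    public String getMethod() {
        // The original POST is reported as GET to the rendering scripts
        return "GET";
    }
}
```

In real Sling code the wrapper extends SlingHttpServletRequestWrapper and is handed to a RequestDispatcher forward, but the override itself is exactly this small.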

A very simple but powerful technique when you want to handle both GET and POST requests coming to a page.

I hope this will help you!!! Feel free to comment, suggest correction or anything else that you like…

AEM & Schema Based JSON Editor

AEM is very powerful when it comes to delivering content via REST. REST is at the core of AEM, and with Sling's selectors, resource resolution, etc. it becomes even more powerful. Also, nowadays REST has become a standard for applications to exchange data in a stateless manner, and most of the time that data is exchanged in JSON format.

You might have created components, templates, pages, etc., and you may have used the .json selector (which outputs JSON in a predefined format) to get a JSON representation of a page. But have you ever run into a use case where you really want to manage actual JSON data in AEM, in a similar way to how you manage other pages?

In this post I'll walk you through how to manage (static) JSON in AEM. So, let's define a use case and acceptance criteria.

Use Case & Acceptance Criteria
1. We want to create a GUI-based JSON editor component in AEM with which we can create/manage JSON data in AEM.
2. We should be able to download the JSON as a file.
3. We should be able to import JSON files edited outside of AEM, and once imported we should be able to edit them via the same GUI.
4. The JSON should be validated before saving in both cases (manual editing via the GUI, or import).
5. We should be able to define a schema (similar to an XSD for XML) against which the JSON will be validated.
6. The AEM component should be able to generate the component's UI from the schema automatically, so we don't have to create separate components for managing different JSON formats.

Here is the final output that we'll get as the solution (a schema-based json-editor). A quick summary of the screenshots:

1) The first screenshot shows a (JSON-based) schema which will be used for validating the JSON that we'll edit/import and for automatically generating a GUI to edit the JSON data.
2) The second screenshot shows the UI generated from the provided schema. It also shows the JSON validation status (top left corner, as a green label) based on the schema.

3) The third screenshot shows the actual JSON output that is generated when edited from the UI and stored in the JCR repository.

FIG. 1: JSON Schema for generating UI and validating JSON


FIG. 2: JSON Editor generated from schema


FIG. 3: JSON View (Generated by adding JSON via editor shown above)


To create this solution I used the following:

1. The json-editor JS library, which allows us to define a schema and generates a GUI based on the provided schema. You can read about the library and its documentation on GitHub:

2. Sling's POST servlet (org.apache.sling.servlets.post.impl.SlingPostServlet), which is very useful when you actually want to perform CRUD (create, read, update and delete) operations on JCR nodes. If you are interested in manipulating nodes in AEM and don't want to deal with low-level JCR sessions, access rights, etc., then look at some simple but very powerful examples at the below URL:


3. Bootstrap and Font Awesome CSS and Icon Library
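For reference, json-editor consumes plain JSON Schema documents. A minimal example of the kind of schema it accepts (the field names below are made up for illustration):

```json
{
  "title": "Person",
  "type": "object",
  "properties": {
    "name": { "type": "string" },
    "age": { "type": "integer", "minimum": 0 }
  },
  "required": ["name"]
}
```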

Also, you can download the complete AEM CRX package, which contains the code (component, template), client libraries and sample pages for this article, at the following location:


You'll notice that the code is very self-explanatory, so I'll not spend much time explaining it, but if you have any questions or need help, feel free to post a comment.

Developing Single Page Applications in AEM using AngularJS

In this article I am going to walk you through how to develop extensible AngularJS applications (to be specific, templates and components) in AEM.

If you have a fair understanding of AngularJS and AEM, this tutorial will be easy to follow; otherwise I recommend gaining at least basic knowledge of both before you attempt to follow this article.

I spent the initial few years of my career writing pure server-side applications using Java/J2EE (Servlets, JSP, Spring, different ORMs, etc.) with a little bit of JavaScript (jQuery, etc.), and I enjoyed it. For the last few years I have also been using modern JavaScript frameworks like AngularJS very heavily; it has been a great experience, I am enjoying it even more, and I definitely recommend learning and leveraging them if you can.

Let’s start with a quick overview of both AEM and AngularJS.

AngularJS
If you are in the web development field (especially front end) then I am sure you must have heard about AngularJS. It is a great framework for building SPAs - single page applications (remember this, as you'll need to refer back to it while going through the article) - and a lot of things can be achieved very easily with minimal JavaScript code. AngularJS brings some cool features to web application development, some of them being:

1. Two Way Data-Binding
2. Templates
3. MVC (Models, Views/Partials, Controllers) and Services
4. Dependency Injection – we can create services, factories and other libraries that can be injected as and when needed.
5. Directives – create custom HTML elements and attributes for your application.

If you are new to AngularJS or want to know more about it (which I recommend), feel free to look at the official AngularJS documentation (very soon AngularJS 2 will become popular, but in this article I'll be talking about AngularJS 1). Here are a few links you can refer to:


AEM (Adobe Experience Manager)
AEM is a CMS developed and managed by Adobe (Adobe acquired it, then known as CQ/Communiqué, from Day Software a few years back). Earlier it was popular as CQ/CQ5. AEM is built on top of some very nice and solid frameworks:

1. OSGi/Felix – allows us to develop modular applications by developing bundles, which expose services/components that can be referenced from other bundles. Bundles are similar to JAR files but can be deployed/undeployed at runtime.

2. JCR/Oak – if you are new to JCR then consider this as a database/repository where everything is saved.

3. Sling – a framework which resolves incoming requests to resources on the server side in a RESTful manner. A resource can be a JCR node, a script, a servlet or anything else stored in the JCR repository.

4. Java/J2EE – I believe Java does not need any explanation :-)

AEM brings in a lot of features, like:
1. Creating reusable templates and components to design pages. This is the feature that will be our focus in this article.
2. Adobe Marketing Cloud Integrations
3. Content tagging and Meta data management
4. Search
5. Manage digital assets via DAM

You can read more about AEM features at:

So, what is special about using AngularJS in AEM?

By now you should have a basic understanding of AngularJS and AEM. If you recall the main features of each (which I highlighted above), you'll notice they represent 2 different approaches to building web pages.

·      AngularJS promotes/encourages building single page applications, i.e. consolidating views, controllers and functionality as much as possible on one page so that your users don't have to refresh the page, with views loaded based on route changes. With the help of $scope/$rootScope, controllers and models we achieve our functional goals. To build a single page application one should know upfront what functionality needs to be clubbed together on that one page, what the dependencies are, what controllers are needed, and how (and which) scope variables and models will be used to share information across the various controllers, views, directives, etc.

·      AEM promotes breaking your application into smaller reusable components and templates. Various components can be combined in a variety of ways to create a web page. Components don't know about each other, and AEM does not prescribe how to share data between components.

And this is exactly what we are going to talk about in this article. There are many ways to use AngularJS in AEM, and we'll see one of them. Here are the steps/activities that we'll follow to develop AngularJS-based AEM components:

1. Create a demo AEM TODO application (similar to http://todomvc.com/examples/angularjs) structure, along with client libraries for AngularJS and Bootstrap.
2. Create an AngularJS template.
3. Create an authorable TODO AngularJS component (with controller, etc.) that can be dropped on any page created using the AngularJS template mentioned above.

Before you start, I recommend downloading/cloning the GitHub project that I have created for this article, so that you'll be able to follow along nicely. Download the project from:


I have also created an AEM CRX package for this application that you can install directly in AEM. I recommend installing the package rather than creating the folders/files manually. Once you are comfortable with the code base and structure, you can do it from scratch later to boost your understanding.

1. Demo AEM TODO app structure & client library

Once you have installed the CRX package for this demo application, you should have a structure similar to the one shown below. You can find the CSS and JS files shown in this diagram in my GitHub project (https://github.com/suryakand/aem-angularjs1-todo-sample).






2. Create an AngularJS template

As we saw earlier, AngularJS and AEM have different recommendations for building an application (composing a page), and therefore if we want to use AngularJS with AEM, the traditional approach to developing templates and components will not work. We have to structure our templates and components differently, so that at runtime, when a page is rendered, the JavaScript code (e.g. controllers from individual components) is assembled and routes are attached to views and controllers based on the dropped components.

Look at the template (page component) under /apps/sif/components/angular/todo-sample/ng-page-todo. This template extends from "mobileapps/components/angular/ng-page" (provided by AEM for mobile apps). If you explore the ng-page-todo template, you'll notice the following structure:




Let’s look at each file one by one and understand how this structure helps us to create an AngularJS based page in AEM.

A. ng-page-todo.jsp: this is the main script file that will be invoked whenever a page created using the "ng-page-todo" template is requested. The structure of this file is very simple: it includes 2 other script files, head.jsp and body.jsp.

The important thing to note here is the usage of the ng-app attribute. Notice how the Angular attribute "ng-app" is initialized on the "html" tag: it is set from the page property variable ${applicationName}, and this will be our AngularJS application name.

B. head.jsp: this is inherited from the base template "mobileapps/components/angular/ng-page", and its primary purpose is to include the CSS client libraries defined in the "css_clientlibs.jsp" file.

C. body.jsp: this is an important file and a lot is going on here, so pay attention. It has the following functions:
  • Includes header.jsp and footer.jsp, where you can define a common header and footer for your application.
  • Renders views based on the WCMMODE (EDIT, PREVIEW). If the WCMMODE is EDIT it includes template.jsp (which renders the component's markup directly), and if WCMMODE is disabled it just renders the ng-view element, which the AngularJS application uses to render views. This part is very important to understand before we move on to the other files. The reason we include either template.jsp or the ng-view element is the way the AngularJS router works: when we are in EDIT mode we don't have access to everything that AngularJS needs to assemble and run the application (controllers, services, etc.).
  • Includes the JavaScript client libraries (js_clientlibs.jsp).
  • Includes "angular-app-module.js", where we define the application module and config for the Angular application. This file is also responsible for initializing the "$routeProvider" with the appropriate views and controllers, with the help of the scripts written in "angular-route-fragment.js.jsp".
  • Includes the "angular-app-controllers.js" file, which is responsible for assembling the AngularJS controller(s) from the components dropped on a page. The controller name is automatically derived from the page name (by removing all special characters),
e.g. resource.getPath().replaceAll("[^A-Za-z0-9]", "")
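As a quick illustration of that derivation (the class name and page path below are mine, just for the example):

```java
public class ControllerName {
    /**
     * Derives an AngularJS controller name from a page path by
     * stripping every non-alphanumeric character, the same way the
     * template does with resource.getPath().replaceAll(...).
     */
    public static String fromPath(String path) {
        return path.replaceAll("[^A-Za-z0-9]", "");
    }
}
```

For example, fromPath("/content/en/todo-app") yields "contententodoapp".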

3. Create an authorable TODO AngularJS component

Now, let's look at the code for the component that we'll be dropping on the AngularJS template.

Look at the "ng-todo-list" component under /apps/sif/components/angular/todo-sample. This component extends from "mobileapps/components/angular/ng-component" (provided by AEM for mobile apps). If you explore the ng-todo-list component, you'll notice the following structure:





Let’s look at each file one by one and understand how this structure helps us to create an AngularJS based component in AEM.

A. "ng-todo-list.jsp": this is the main script file, responsible for rendering content, and it is called first. Its structure is very simple: it includes only one script file, template.jsp, where we have the markup for our component.

B. "overhead.jsp": if our component needs any initialization that is independent of AngularJS, this is the script file to use. In our case we are not doing much here.

C. "controller.js.jsp": this is where we'll write the AngularJS controller code. Please note that each component has its own "controller.js.jsp". When multiple components are placed/dropped on a page/template, the "angular-app-controllers.js" script, which is part of the template, clubs the controller code into one controller.

Take a look at the source (view source) of a page (created when you installed the sample application). Also look at the generated JavaScript files (todo-app.angular-app-module.js and todo-app.angular-app-controllers.js) in the source. Then repeat the same process (view source) with WCM mode disabled (append wcmmode=disabled as a query parameter), e.g. http://localhost:4502/content/en/todo-app.html?wcmmode=disabled

I have tried to keep this article simple, to give you a head start and information about how you can build AngularJS applications in AEM. If you find something that can be improved in this application or in the sample code, feel free to leave a comment or write directly to me and I'll try to work on it.

Thanks for reading!!!


AEM - Query list of components and templates

Sometimes, during a migration or for auditing purposes, we need to know where a particular component/template or any other resource has been used. Have you ever wondered how to fetch a list of all the paths where a particular component or template is being used?

In this short article I am going to show a quick way to write a utility with which you can get a list of paths (actual resource nodes) where a targeted resource is being used.

Here is the code snippet:

public void getResourceListHelper(Resource parentResource, String targetSearchPropertyKey,
        String targetSearchPropertyValue, List<Resource> components) {

    for (Iterator<Resource> resourceChildrenIter = parentResource.listChildren(); resourceChildrenIter.hasNext();) {
        Resource childResource = resourceChildrenIter.next();
        ValueMap childProperties = childResource.adaptTo(ValueMap.class);

        // Collect the child if the target property matches
        if (childProperties != null && targetSearchPropertyValue.equals(
                childProperties.get(targetSearchPropertyKey, String.class))) {
            components.add(childResource);
        }

        // Recurse into the child's subtree
        getResourceListHelper(childResource, targetSearchPropertyKey, targetSearchPropertyValue, components);
    }
}

The code is pretty much self-explanatory, but I'll give you a quick walkthrough:
  1. The Java method takes 4 arguments:
    • parentResource: the parent resource/node under which we want to perform the search.
    • targetSearchPropertyKey: the property name that we want to search for under "parentResource" and its sub-resources (child nodes).
    • targetSearchPropertyValue: the value of the property that should be matched.
    • components: the collection/list where all the matching resources will be accumulated once the search is complete.
  2. There is a recursive call to navigate the child resources/nodes under the given parent resource.
  3. The method does not return anything; instead, the result (the resources that matched the search) is accumulated in the components collection.
Let's look at one quick day-to-day scenario. Let's say we have a web site created in AEM under /content/sites/demo and we want to know all the nodes where the "title" component is used. To get this information we call (where demo is the Resource for /content/sites/demo):

getResourceListHelper(demo, "sling:resourceType", "app/components/title", resourceList);
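The same accumulate-while-recursing pattern can be exercised outside AEM with a plain tree. The Node and ResourceFinder classes below are my stand-ins for Sling's Resource and the helper above, just to show the shape of the traversal:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Minimal stand-in for a Sling Resource: a path, some properties, children
class Node {
    final String path;
    final Map<String, String> properties;
    final List<Node> children = new ArrayList<>();

    Node(String path, Map<String, String> properties) {
        this.path = path;
        this.properties = properties;
    }
}

class ResourceFinder {
    /** Recursively collects nodes whose property `key` equals `value`. */
    static void collect(Node parent, String key, String value, List<Node> out) {
        for (Node child : parent.children) {
            if (value.equals(child.properties.get(key))) {
                out.add(child);
            }
            collect(child, key, value, out); // descend into the subtree
        }
    }
}
```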

Thanks for reading!!!

First Prize Winner at AT&T Hackathon and Wonderful Experience


This past weekend I participated in a Tech IoT Hackathon organized by AT&T and IBM in San Francisco and won first place. It was 2 sleepless days, and it paid off well by winning the competition.

It was a 2-day event where participants/teams were expected (preferably) to stay in the facility provided by the organizers and code their respective ideas. The idea was to use the AT&T and IBM platforms (e.g. M2X, Watson, Bluemix) along with IoT sensors/microcomputer boards (e.g. Raspberry Pi) to solve a day-to-day problem.

It was a nice experience and an opportunity to interact with a lot of talented people from around the world. The other teams also came with nice ideas and implemented them well.

A big thanks to the organizers, all the participants and everyone who supported this event and shared their ideas during it; it was very inspiring, and more food for thought.







More details about the event:




I'll try to keep doing this more often now :-)

Thanks for reading!

Certification - IBM Bluemix Cloud Platform

Recently I took the IBM Bluemix Cloud certification and cleared it. If you have been working on a cloud platform, I recommend going for it, as it'll enhance your knowledge. If you know Cloud Foundry and have done hands-on development with it, then this certification will be a piece of cake for you, because Bluemix is based on Cloud Foundry.

This certification covers following areas:
·      Hosting Cloud Applications
·      Planning Cloud Applications
·      Designing and developing Cloud Ready Applications
·      Enhancing Cloud Applications using Managed Services
·      Using Data Services
·      Cloud Application Security
·      Using DevOps toolchain to develop and deliver cloud applications
·      Managing a running cloud application

Cloud platforms, especially PaaS offerings, are the next big thing that the developer community is going to fall in love with. I enjoyed doing hands-on development on Bluemix, and it helped me a lot in reducing the effort for recurring tasks that I used to do in the past (e.g. setting up a Tomcat server, a database, etc.); with Bluemix this was just the click of a button. Another big benefit of these cloud platforms is that you don't need to worry about networking, DNS, etc., and you can expose your application to everyone (in a controlled way) without having to keep your own servers running.

I started working on the IBM Bluemix cloud a few months back on a customer project where my team was responsible for developing Node.js-based microservices using LoopBack and IBM API Connect.

The journey started with figuring out the right way of doing it, because most of us had no experience with cloud development. We tried and tested a few frameworks and patterns to develop a proof of concept in mid-2016. Here are a few key highlights:

1)    Leverage the LoopBack framework (from StrongLoop: https://strongloop.com) to create microservices. It is a Node.js-based framework that provides a lot of great features out of the box for developing REST services
2)    Knowledge of the cloud platform is key, and IBM Bluemix has a boilerplate for LoopBack-based projects
3)    When we started, we had no real servers or real data sources, so Bluemix helped us spin up new servers and write mock data sources to continue our development
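As an illustration of how declarative LoopBack development is (point 1 above), a model definition is mostly JSON. This is a hypothetical example (model name and properties are invented); in LoopBack 2.x/3.x a definition like this, wired to a data source, yields a full REST CRUD API without hand-written controller code:

```json
{
  "name": "Task",
  "base": "PersistedModel",
  "idInjection": true,
  "properties": {
    "title": { "type": "string", "required": true },
    "done": { "type": "boolean", "default": false }
  },
  "validations": [],
  "relations": {},
  "acls": [],
  "methods": {}
}
```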

This year we are pretty close to sending our work to production, and we are all very excited about it.

So, after spending a good amount of time with the Bluemix cloud platform, I thought it was a good time to assess my knowledge. If you are planning to go for this certification, my recommendation would be to play around with Bluemix for at least a month (you can get a free trial account) and develop some sample applications. This will give you more confidence, and at the same time you’ll learn some cool stuff, which is the real value of certification.


Thanks for reading!

Dynamically Loading Angular 2 components

In Angular 2, component templates are not always fixed; an application may need to load new components at runtime. This tutorial shows you how to use ComponentFactoryResolver to add components dynamically.

Let’s start with why we need to load components dynamically in Angular 2. To understand that, we first need to understand how Angular 2 components are rendered in the HTML DOM and how they are organized in Angular 2 modules. I’ll not go into a lot of detail about Angular 2 modules and how code is organized, but I’ll try to give you a fair understanding so that you can see why dynamic loading is needed.

An Angular 2 application will typically have one or more NgModules (@NgModule), and there will be at least one root NgModule. Many Angular libraries are modules (such as FormsModule, HttpModule, and RouterModule), and many third-party libraries are available as NgModules (such as Material Design, Ionic, AngularFire2). By convention (though not necessarily), the root module class is called AppModule and it lives in a file named app.module.ts. NgModules consolidate components (bootstrap and entry components), directives, and pipes into cohesive blocks of functionality, each focused on a feature area, application business domain, workflow, or common collection of utilities.

Modules can also add services to the application. Such services might be internally developed, such as an application logger, or come from outside sources, such as the Angular router and Http client. Modules can be loaded eagerly when the application starts, or lazy loaded asynchronously by the router. Here is an example of an NgModule:

import { NgModule }      from '@angular/core';
import { BrowserModule } from '@angular/platform-browser';
import { AppComponent }  from './app.component';
// Component1, Component2 and EntryComponent are placeholders for your own components
import { Component1, Component2 } from './components';
import { EntryComponent } from './entry.component';

@NgModule({
  imports:         [ BrowserModule ],
  declarations:    [ AppComponent, Component1, Component2 ],
  bootstrap:       [ AppComponent ],
  entryComponents: [ EntryComponent ],
  providers:       [ /* services/providers */ ],
})
export class AppModule {}

The @NgModule decorator defines the metadata for the module. The metadata imports a single helper module, BrowserModule, which every browser app must import. BrowserModule registers critical application service providers. It also includes common directives like NgIf and NgFor, which become immediately visible and usable in any of this module's component templates.

The bootstrap list identifies the root component(s), in this case AppComponent. When an Angular 2 application starts, the root component(s) are the first ones that get attached/rendered in the HTML DOM; this is the line that does it:

platformBrowserDynamic().bootstrapModule(AppModule);

All other components can be used within the scope of the root component (i.e. other components must be loaded inside the root component). Here is example HTML where AppComponent will be bootstrapped:

<html>
<head>

</head>
<body>
<app></app>
</body>
</html>

Note that an Angular 2 application can have more than one root component too, each with its own unique selector.

Now, let’s say we want to render “Component1” (via its own selector) outside of the root component (<app>); it will not render, and this is where we need “ComponentFactoryResolver” to load the component dynamically. This is just one scenario where you’ll need dynamic component loading. Think of another situation where you get HTML from some external service and that HTML contains selectors that need to be rendered as Angular 2 components; again, we’ll need dynamic component loading.

So, how do we load components dynamically?

We need 2 main things:
  1. A dynamic component loader class that makes use of ComponentFactoryResolver
  2. A hook during application bootstrap that will use the dynamic component loader

  1. Dynamic Component Loader class: the DynamicNg2Loader class loads Angular 2 components dynamically at runtime, outside of the root component. This is needed because in AEM an (Angular 2) component can be used many times, including outside of the root component's scope. If a component is outside of the root component's scope, Angular 2 will ignore it and not render it, and that's why we need DynamicNg2Loader. Here is sample code for DynamicNg2Loader:
import {
    ApplicationRef,
    ComponentFactoryResolver,
    ComponentRef,
    Injector,
    NgModuleRef,
    NgZone,
    Type
} from '@angular/core';

export class DynamicNg2Loader {
    private appRef: ApplicationRef;
    private componentFactoryResolver: ComponentFactoryResolver;
    private zone: NgZone;
    private injector: Injector;

    constructor(private ngModuleRef: NgModuleRef<any>) {
        this.injector = ngModuleRef.injector;
        this.appRef = this.injector.get(ApplicationRef);
        this.zone = this.injector.get(NgZone);
        this.componentFactoryResolver = this.injector.get(ComponentFactoryResolver);
    }

    /**
     * Render a component at the given DOM element
     */
    loadComponentAtDom<T>(component: Type<T>, dom: Element, onInit?: (component: T) => void): ComponentRef<T> {
        let componentRef;
        this.zone.run(() => {
            try {
                let componentFactory = this.componentFactoryResolver.resolveComponentFactory(component);
                componentRef = componentFactory.create(this.injector, [], dom);
                onInit && onInit(componentRef.instance);
                this.appRef.attachView(componentRef.hostView);
            } catch (e) {
                console.error("Unable to load component", component, "at", dom);
                throw e;
            }
        });
        return componentRef;
    }
}

  2. Hook that will use the Dynamic Component Loader: bootstrap AppModule and use the DynamicNg2Loader to render components that are outside of the root/bootstrapped component's scope. Here is sample code:

const componentList = {
    'text-area': TextAreaComponent,
    'task-list': TaskListComponent,
    'about': AboutComponent,
    'task': TaskListComponent
};

platformBrowserDynamic().bootstrapModule(AppModule).then(function (ngModuleRef) {
    console.log("I have a reference to the NgModuleRef: ", ngModuleRef);
    let ng2Loader = new DynamicNg2Loader(ngModuleRef);

    Object.keys(componentList).forEach(function (selector) {
        let container = document.getElementsByTagName(selector);
        if (container) {
            for (let i = 0; i < container.length; i++) {
                let element = container.item(i);
                let compRef = ng2Loader.loadComponentAtDom(componentList[selector], element, (instance) => {
                    console.log(selector + ' component loaded');
                });
            }
        }
    });
});
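The scan-and-attach loop above boils down to a pure selector lookup that can be unit tested without a DOM. matchComponents below is a hypothetical helper written for illustration, not part of the post's actual code:

```typescript
// Pure helper mirroring the loop above: given the selector-to-component
// registry and the tag names found on a page, return the components that
// must be instantiated. Hypothetical illustration only.
function matchComponents<T>(
    registry: { [selector: string]: T },
    tagsOnPage: string[]
): T[] {
    return tagsOnPage
        .filter(tag => registry.hasOwnProperty(tag))
        .map(tag => registry[tag]);
}

// Only selectors present in the registry are matched:
const demoRegistry = { 'text-area': 'TextAreaComponent', 'task-list': 'TaskListComponent' };
console.log(matchComponents(demoRegistry, ['text-area', 'div', 'task-list']));
// → [ 'TextAreaComponent', 'TaskListComponent' ]
```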


I hope you have enjoyed reading this post!

Part 2 - AEM with Angular 2 – Building project using Gulp and Maven

Angular 2 is modular in nature, which means there are a number of smaller modules that you can mix and match based on what your project requires. When you are developing an Angular 2 application outside of AEM, building and shipping the application is not a big challenge because there are many build tools (e.g. Angular CLI) available to take care of bundling and packaging these Angular modules and dependencies. But in AEM, components are just building blocks; content authors can drag and drop these components anywhere, as many times as they need, so reusability is key: authors should be able to use Angular 2 components the same way as regular AEM components. This means the build tools available today can’t help, because those tools need to know upfront where components are used before they compile the application, whereas in AEM authors can drag and drop Angular 2 components anywhere on a page after the components have been developed.

To address this problem we need to write a custom build script using gulp (we could use Grunt as well) and Maven (the gulp task will be invoked from the Maven build using “exec-maven-plugin”). Before looking at the actual build scripts, let’s look at the project structure; then we’ll see how the build script compiles TypeScript files, places them in a different folder, and packages everything for deployment in AEM.

Let’s get started!!!

I am using AEM Maven archetype for creating a multi-module maven project which contains following projects:

[Image: list of generated Maven project modules]

Out of these projects we’ll be focusing mainly on:
  • “core” - contains the servlet that renders Angular 2 templates, which we’ll see later in this series
  • “ui.apps” - contains the template, pages, components, and Angular 2 dependencies (in the /etc folder)

Project Structure - ui.apps
First we’ll look at the “ui.apps” project, which contains the major chunk of our code. Here is an expanded view of the ui.apps project:

[Image: expanded view of the ui.apps project]


  1. Page template - this folder (/structure/page) contains the AppModule (https://angular.io/guide/bootstrapping) and the renderer component (app or root component) for the template that we’ll use to create pages where we’ll drop Angular 2 components. This is a simple page template with a few columns and a parsys where we can drop components. The only important thing to note here is the “systemjs.config.js” file (included in the page’s head section), which is responsible for loading Angular 2 dependencies on the page. We’ll talk more about “systemjs.config.js” later in this article (point #13)
  2. Components - this folder (/apps/ngaem/components/content) contains all AEM components, both Angular 2 (e.g. ng-app, text-area) and non-Angular components.
  3. ng-app component - this is the main Angular 2 root component that is bootstrapped by the Angular 2 library when the application loads. You can read more about what a root component is in Angular 2 here: https://angular.io/guide/bootstrapping
  4. text-area component - this is an Angular 2 + AEM component that leverages the power of Angular 2 but is an AEM component that authors can drag and drop into a parsys like any other AEM component. We’ll discuss this component in depth to learn how to develop an AEM component using Angular 2
  5. Angular 2 dynamic component loader - this is an important file which allows us to render/load Angular 2 components dynamically in the HTML DOM. If you want to read more about what dynamic component loading is and why we need it, please refer to: http://suryakand-shinde.blogspot.com/2017/06/dynamically-loading-of-angular-2.html
  6. main.ts - this is the first file loaded by “systemjs.config.js” while initializing the Angular 2 application. It loads AppModule (more on it later), which in turn loads the root component and, with the help of the Dynamic Loader discussed above, the other Angular 2 components
  7. lib folder (/etc/designs/ngaem/lib) - contains Angular 2 dependencies (JavaScript libraries)
  8. build folder - our gulp build script has 2 tasks, “gulp build” and “gulp build:aem”. The “gulp build” task builds the project so that we can run the Angular 2 project without deploying it to AEM (i.e. run it like any regular Angular 2 application). It compiles .ts files to JavaScript, copies the HTML templates for Angular 2 components, and copies a different version of the “systemjs.config.js” file (one that loads dependencies from the /build/lib folder) and index.html into the build folder. You can run the generated application in the build folder using the following command:
npm run start
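For context, the “start” command above is an npm script defined in package.json. A sketch of the relevant scripts section (the exact script bodies are assumptions, mirroring the Angular 2 quickstart setup) might look like this:

```json
{
  "scripts": {
    "start": "tsc && concurrently \"tsc -w\" \"lite-server\"",
    "build": "gulp build",
    "build:aem": "gulp build:aem"
  }
}
```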

  9. gulpfile.ts (frontend build script) - this file contains the gulp build tasks (gulp build and gulp build:aem). The “gulp build” task generates files and copies them into the “build” folder so that we can run the application without deploying to AEM. The “gulp build:aem” task does the same thing but also updates the “templateUrl” of Angular 2 components so that they leverage the Sling servlet (that we’ll talk about later) to load the components’ views. E.g. before running this task a component has:

@Component({
   selector: 'text-area',
   templateUrl: '/apps/ngaem/components/content/text-area/text.html'
})

which will be changed to the following after “gulp build:aem” finishes:

core_1.Component({
    selector: 'text-area',
    templateUrl: '/bin/ngtemplate?path=/apps/ngaem/components/content/text-area/text.html'
})
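Under the hood, this rewrite can be done with a simple regex transform over the compiled output. Here is a sketch; rewriteTemplateUrls is a hypothetical helper name for illustration, not the project's actual gulp code:

```typescript
// Sketch of the path rewrite that "gulp build:aem" performs on compiled output.
// Prefix every templateUrl pointing into /apps with the Sling servlet endpoint.
function rewriteTemplateUrls(source: string): string {
    return source.replace(
        /templateUrl:\s*(['"])(\/apps\/[^'"]+)\1/g,
        'templateUrl: $1/bin/ngtemplate?path=$2$1'
    );
}

const before = "templateUrl: '/apps/ngaem/components/content/text-area/text.html'";
console.log(rewriteTemplateUrls(before));
// → templateUrl: '/bin/ngtemplate?path=/apps/ngaem/components/content/text-area/text.html'
```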

  10. pom.xml (Maven file) - this file contains all Java dependencies for the AEM project and the “exec-maven-plugin” Maven plugin that executes the gulp task (gulp build:aem) while packaging the project (creating the CRX package) for deployment in AEM
  11. tsconfig.json and typings.json - configuration files for the TypeScript compiler. You can read more about the configuration options for customizing the TypeScript compiler and loading type definitions here: https://www.typescriptlang.org/docs/handbook/migrating-from-javascript.html
  12. package.json - this is the standard file for Node projects, or for projects that want to leverage Node modules (like gulp, the TypeScript compiler, etc.) at build time. Since we are using gulp and TypeScript, we need this file. In it we define Node module dependencies (the way we define dependencies for Java projects in pom.xml)
  13. systemjs.config.js - SystemJS config file that loads Angular 2 dependencies on the AEM page template (from the /etc/designs/ngaem/lib folder) and bootstraps Angular 2’s main module (AppModule) into the HTML DOM
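To give an idea of what that file contains, here is a trimmed sketch of a systemjs.config.js mapping Angular bundles to the /etc/designs/ngaem/lib folder. The exact bundle list and paths are assumptions based on the project layout described above, and the inline System stub exists only to keep the sketch self-contained (on an AEM page the real SystemJS library provides the System global):

```typescript
// Stub so this sketch runs standalone; SystemJS provides the real System.
const System = { config: (c: any) => c };

const cfg = System.config({
    paths: { 'npm:': '/etc/designs/ngaem/lib/' },
    map: {
        app: '/etc/designs/ngaem/js',
        '@angular/core': 'npm:@angular/core/bundles/core.umd.js',
        '@angular/common': 'npm:@angular/common/bundles/common.umd.js',
        '@angular/compiler': 'npm:@angular/compiler/bundles/compiler.umd.js',
        '@angular/platform-browser': 'npm:@angular/platform-browser/bundles/platform-browser.umd.js',
        '@angular/platform-browser-dynamic': 'npm:@angular/platform-browser-dynamic/bundles/platform-browser-dynamic.umd.js'
    },
    // "app" is the package holding main.js, the entry point discussed above
    packages: { app: { main: './main.js', defaultExtension: 'js' } }
});

console.log(cfg.paths['npm:']);
// → /etc/designs/ngaem/lib/
```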

Project Structure - core
“core” is another Maven module/project which mainly contains Java code (OSGi services, Sling Models, filters, servlets). For this tutorial the only important piece of code we’ll focus on is “AngularTemplateServlet”.

[Image: core project structure]
  1. AngularTemplateServlet - Angular 2 components can have their view either as part of the same JavaScript file or loaded from an external HTML file. To make it simpler and align with how we develop components in AEM, we’ll load views from external files. Here is an Angular 2 component code snippet that shows how we load a view from an external HTML file:

@Component({
   selector: 'text-area',
   templateUrl: '/apps/ngaem/components/content/text-area/text.html'
})

As you can see, the “templateUrl” property is trying to load the template from the /apps folder (which is possible but not recommended). Loading views from /apps would mean exposing the /apps folder to the outside world, and that might lead to serious problems. To overcome this, I created a very simple Sling servlet that reads the content of HTML files in /apps and feeds it to the Angular 2 component. For this we need to update the “templateUrl” property as follows:

core_1.Component({
    selector: 'text-area',
    templateUrl: '/bin/ngtemplate?path=/apps/ngaem/components/content/text-area/text.html'
})


This path update is done by the gulp task “gulp build:aem” (which is executed as part of the Maven build), so you don’t have to do it manually.

With this you are now aware of the project structure, the various files, the build tasks, and why the project is organized like this. In the next part I’ll focus on component development, so please stay tuned.


Thanks for reading!!!

Part 1 - Angular 2 with AEM – Introduction, Challenges, Installation and Prerequisite


I am writing about AEM after a long time, and there is a reason for that :-) I was learning something else interesting, and that interesting thing is frontend technologies. So far in my professional career I have spent most of my time designing and developing server-side applications, mainly using the Java technology stack (Spring, ORM, JMS and what not), with basic UI development using JSP/JSF, HTML, Velocity, Sightly, jQuery, ExtJS, etc. In the last few years I have also delivered many applications combining AEM, other CMSs, and Java backends but, I felt there was still something missing, and that missing piece was end-to-end knowledge of frontend development. That is what I have been focusing on in the recent past. I thought it was a good time to combine frontend development knowledge with AEM and come up with something interesting that might help my friends out there.

In this multi-part series (with tutorials in the coming parts), I’ll show you how you can use Angular 2 in AEM. There are a lot of posts already out there, so why do I need a separate post on this topic? Most of the articles I see on the web about using AEM and Angular 2 together are either incomplete, or just show how to create a clientlib for Angular 2 dependencies, or how to create a plain-vanilla Angular 2 component (not an AEM component) that authors cannot drag and drop like regular AEM components. Over almost the last 2 years I spent a good amount of time learning frontend technologies (Angular 1.x, Angular 2, ES6, TypeScript, Ionic 1 & 2, NativeScript, etc.) along with Node.js, and I felt this was the time when I knew both worlds, AEM and Angular 2, quite well and could attempt to develop something harnessing the power of both. You’ll have to be focused and patient while going through this series.
For those who don’t know what AEM and Angular 2 are, I’ll begin with a quick non-technical introduction.
What is AEM?
Adobe Experience Manager (AEM), is a comprehensive content management platform solution for building websites, mobile apps and forms - making it easy to manage your marketing content and assets. AEM empowers business users and non-technical staff to create and manage content as per business needs.
What is Angular 2?
Angular is a platform that makes it easy to build applications with the web. Angular combines declarative templates, dependency injection, end to end tooling, and integrated best practices to solve development challenges. Angular 2 comes with almost everything you need to build a complicated frontend web or mobile apps, from powerful templates to fast rendering, data management, HTTP services, form handling, and so much more. Angular 2 is a component centric framework that promotes reusable component development. In Angular 2, “everything is a component.” Components are the main way we build and specify elements and logic on the page, through both custom elements and attributes that add functionality to our existing components.

Since Angular 2 is primarily based on TypeScript, we get all the benefits of a typed language. Angular 2 and TypeScript bring true object-oriented web development to the mainstream, in a syntax that is strikingly close to Java 8.
Why use Angular 2 in AEM?
For any delivery channel (web, mobile, etc.) content is no doubt key but, at the same time, it is also very important how that content is presented to users (this is where user experience comes in). No matter how good content is, if it is not available to users in a format that is easy to read and understand, in the right context, and with good performance, it will lose its value. This is where Angular 2 helps design a more robust and performant user experience using modern technology and standards.

Challenges that I faced while integrating Angular 2 with AEM
  1. AEM and Angular 2 components are different: Angular 2 is modular in nature, which means there are a number of smaller modules that you can mix and match based on what your project requires. When you are developing an Angular 2 application outside of AEM, building and shipping the application is not a big challenge because there are many build tools (e.g. Angular CLI) available to take care of managing these smaller Angular modules and dependencies. But in AEM, components are just building blocks; content authors can drag and drop these components anywhere, as many times as they need, so reusability is key: authors should be able to use Angular 2 components the same way as regular AEM components. This means the build tools available today can’t help, because those tools need to know upfront where components are used before they compile the application, whereas in AEM this is different (as we discussed above).

To address this problem we need to write our own custom build script using gulp (we could use Grunt as well) and Maven.

  2. Angular 2 application root component bootstrapping and the AEM component model: typically an Angular 2 application has one root element where the main application is bootstrapped. All other Angular 2 components have to be children of this root element, or Angular will not detect them. For example:
[Image: Angular 2 root component and child component hierarchy]

In the above diagram, <app> is the application root component, and everything else (including components imported from other Angular 2 modules) needs to be rendered inside <app>. This is a big challenge because we want a parsys to be available inside an Angular 2 component, and we should also be able to drag and drop components into that parsys... this sounds complex, huh!!! Yes, it is a little complicated, and it took some time for me as well to figure out how this can be achieved.

To overcome this, we need to use some of Angular 2’s core classes like “ComponentFactoryResolver” and “ApplicationRef” to change the way an Angular 2 application is normally bootstrapped and to load Angular 2 components dynamically at runtime.

  3. Angular 2 components & authoring: one of the key capabilities of AEM components is that content authors have full control over text/labels; they can change text without the IT team’s involvement. On the other hand, an Angular 2 component needs to know text values (labels etc.) upfront, and if there are any changes we need to rebuild the application; at best we can externalize labels into translation bundle files so that we don’t need to change code but, still, this is not aligned with how component authoring works in AEM.
To overcome this, we need to use Angular 2 features like @Input so that we can read JCR properties and pass them to the Angular 2 component’s template for rendering.

  4. Angular 2 component’s view/template: let’s look at a simple Angular 2 component:

import {Component, OnInit} from "@angular/core";

@Component({
   selector: "app",
   templateUrl: "/apps/ngaem/content/ng-app/app.html"
})
export class AppComponent implements OnInit {
   ngOnInit() {
       console.log("Application component initialized ...");
   }
}

As you can see, the Angular 2 component’s actual view lives in an external template, “app.html”. If we want to treat an Angular 2 component as an AEM component, then app.html becomes a Sightly template (or JSP, though JSP is not recommended because we want to leverage HTML5 features with Angular 2). If you notice, the template is fetched directly from the /apps folder (a secured folder on the server), and we don’t want users to have direct access to anything that resides in /apps in a publish environment.

To overcome this, we need to write a custom Sling servlet that can serve the view/template to the Angular component. This servlet is responsible for reading the view file (e.g. app.html) from the /apps folder and sending a text/html response back to the caller (in this case the Angular 2 component). Here is a quick example of how the view is referenced from the Angular 2 component:

@Component({
   selector: "app",
   templateUrl: "/bin/ngtemplate?path=/apps/ngaem/content/ng-app/app.html"
})

/bin/ngtemplate is the Sling servlet that serves the content of app.html.

  5. Preview of Angular 2 components in the author environment: a component can be dropped by an author on a page (in edit mode), and they can preview how the actual page looks (in preview mode). When an Angular 2 component is dropped on a page, it needs to be picked up by the Angular 2 library and bootstrapped/attached to the DOM. Even though the Angular 2 library is available on the page, for some weird reason Angular was not able to bootstrap/attach the component’s view when I was in EDIT and PREVIEW mode. For the Angular component to work I had to use wcmmode=disabled in the query string.

Image: Angular 2 TextArea Component in EDIT/PREVIEW mode

Image: Angular 2 TextAreaComponent in DISABLED mode

To overcome this I used “wcmmode”: if wcmmode is either EDIT or PREVIEW, the component’s .html file is rendered directly, and in this case data will not be bound, i.e. {{some_angular_variable}} will be rendered rather than the actual value. Here is a quick code snippet that shows how I load the template in EDIT/PREVIEW mode vs. when wcmmode is DISABLED (pass the query parameter ?wcmmode=disabled in the URL):

<div data-sly-use.logic="logic.js">
    <div data-sly-test="${wcmmode.edit || wcmmode.preview}">
        <!-- Render HTML file in EDIT and PREVIEW mode -->
        <section data-sly-include="text.html"></section>
    </div>
    <div data-sly-test="${!wcmmode.edit && !wcmmode.preview}">
        <!-- Rendered by the Angular 2 component when wcmmode=disabled -->
        <text-area text="${properties.text}"></text-area>
    </div>
</div>

  6. Speed of development and validating Angular and AEM functionality independently: developers are more productive when they can make code changes and quickly verify them without redeployment. We feel more confident when we have a suite of unit tests to verify our code so that undesirable code changes are caught during development. When we want to use Angular 2 in AEM, we have to deal with 2 different approaches for developing and testing the application, because Angular 2 development itself is quite different from AEM and Angular 2 is mostly used without AEM. The challenge comes when we want to use both together. Both Angular 2 and AEM components should be unit testable (Angular 2 using Mocha, Chai, Sinon, etc.; AEM with JUnit, Mockito, etc.), and developers should be able to test their code quickly without redeployment.
To do this, I leveraged the frontend build tool gulp to create two separate Angular 2 build tasks:
  1. One for AEM, which compiles TypeScript files, updates the template paths in Angular 2 components with the Sling servlet context path, and copies the resources needed to create the AEM CRX package
  2. A second for building a pure Angular 2 project that can be built and tested outside of AEM using a local lite-server: https://github.com/johnpapa/lite-server

  7. Importing Angular 2 dependencies and loading them on an AEM template: Angular 2 primarily recommends TypeScript for application development, although you can use JavaScript as well. If you are from a Java background, you’ll feel like you are writing Java code when you use TypeScript, and you get almost all the benefits of a typed language. When you want to use Angular 2 in AEM, you need to bring all of Angular’s dependencies (e.g. @angular/core, @angular/common, @angular/compiler, @angular/http, etc.) into AEM and make them available on your pages via the template. These dependencies are modular and can be loaded on demand (i.e. when really needed by your Angular 2 components or services/providers). For example, if you are writing a simple Angular 2 text component that does not talk to any REST service, you really don’t need the @angular/http module loaded on your page. In AEM there are 2 ways to achieve this: using clientlibs (defining inter-dependencies between various modules), or using frontend bundle packaging and loading tools like Webpack or SystemJS.

In this series, I am going to use SystemJS. SystemJS allows you to load JS dependencies from AEM.

I just wanted to give you an overview of some of the challenges I faced while integrating Angular 2 with AEM, so that you can follow the rest of this series with better context and background. I’ll cover each of these challenges in more detail in other parts of this series.

What do you need to know to follow the remaining parts of this series?
If you know Java, AEM and Angular 2, you can easily follow this series but, if you want to do some hands-on work along the way, I recommend getting a fair understanding of the following:
  1. Good understanding of Java and AEM (components, templates, Sling Servlet etc.)
  2. Good understanding of JavaScript and Typescript (https://www.typescriptlang.org)
  3. Good understanding of Angular2 (https://angular.io/docs)
  4. Good understanding of Systemjs (https://github.com/systemjs/systemjs)
  5. Basic understanding of frontend build tools (gulp: http://gulpjs.com) and the Maven build tool (https://maven.apache.org)

Tools and Installation
Also, to build and run the project you’ll need to install:
  1. Java/JDK 1.7 or later - https://www.java.com/en/download
  2. AEM - If you don’t have AEM already then getting it might be tricky because it is licensed. You can look at this thread https://forums.adobe.com/thread/2322257 and another link https://helpx.adobe.com/support.html#/top_products

I hope you have some context now and will enjoy the next parts in this series. Get ready by installing the tools listed above. Stay tuned for a GitHub repository with sample code and links to the next parts in this series.

Part 2: http://suryakand-shinde.blogspot.com/2017/06/part-2-aem-with-angular-2-building.html


Thanks for reading!

AEM 6.3 - Bundle Whitelisting - Deprecation of administrative authentication

I stumbled on an issue when I was using neba with AEM 6.3. I created a few neba ResourceModels, and when I tried to access the neba Model Registry, I got an error (java.lang.IllegalStateException: org.apache.sling.api.resource.LoginException: Bundle org.eclipse.gemini.blueprint.extender is NOT whitelisted):

Image: neba Model Registry Menu


Image: Error Screen

NOTE: The neba team has already fixed this on their development branch, so we won’t need to explicitly add the whitelisting configuration for the neba bundle.

Here is the reason for the error:
The ResourceResolverFactory.getAdministrativeResourceResolver and SlingRepository.loginAdministrative methods were originally defined to provide access to the resource tree and the JCR repository. These methods proved to be inappropriate because they allow much too broad access.
Consequently, these methods are being deprecated and will be removed in future releases of the service implementations.
The following methods are deprecated:
  • ResourceResolverFactory.getAdministrativeResourceResolver
  • ResourceProviderFactory.getAdministrativeResourceProvider
  • SlingRepository.loginAdministrative
The implementations in Sling's bundles will remain for the near future, but there is a configuration switch to disable support for these methods: if a method is disabled, a LoginException is always thrown from it. The JavaDoc of the methods has been extended with this information.

Whitelisting bundles for administrative login

In order to manage the few (hopefully legitimate) uses of the above deprecated methods, a whitelisting mechanism was introduced with SLING-5153 (JCR Base 2.4.2).
The recommended way to whitelist a bundle for administrative login is via a whitelist fragment configuration. It can be created as an OSGi factory configuration with the factoryPID org.apache.sling.jcr.base.internal.LoginAdminWhitelist.fragment.

E.g. a typical configuration file might be called org.apache.sling.jcr.base.internal.LoginAdminWhitelist.fragment-myapp.config and could look as follows:

whitelist.name="myapp"
whitelist.bundles=[
   "com.myapp.core",
   "com.myapp.commons"
]

In general, try to avoid administrative login in code you write in your own bundles (prefer service users instead) but, if you are using some third-party bundle that still relies on it, you can add a whitelist configuration as explained above to get it working.
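To see why the fragment configuration fixes the error, here is a plain-Java sketch of the whitelist idea (an illustration only, not the actual org.apache.sling.jcr.base implementation): each fragment contributes a named set of bundle symbolic names, and administrative login is rejected for any bundle that no fragment lists.

```java
import java.util.Collection;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Illustration of Sling's LoginAdminWhitelist concept: fragments (one per
// factory configuration) contribute bundle symbolic names that may still
// use the deprecated administrative login methods.
class LoginAdminWhitelistSketch {

    private final Map<String, Set<String>> fragments = new HashMap<>();

    // Corresponds to one whitelist fragment configuration, e.g.
    // org.apache.sling.jcr.base.internal.LoginAdminWhitelist.fragment-myapp.config
    void addFragment(String name, Collection<String> bundleSymbolicNames) {
        fragments.put(name, new HashSet<>(bundleSymbolicNames));
    }

    // A bundle may use administrative login if ANY fragment lists it.
    boolean allowsAdminLogin(String bundleSymbolicName) {
        for (Set<String> bundles : fragments.values()) {
            if (bundles.contains(bundleSymbolicName)) {
                return true;
            }
        }
        return false;
    }
}
```

With the “myapp” fragment from the example above in place, com.myapp.core and com.myapp.commons would be allowed to log in administratively, while org.eclipse.gemini.blueprint.extender would still be rejected (which is exactly the LoginException I hit) until a fragment listing it is added.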

I hope this information saves you some time. Thanks for reading!!!

Part 3 - AEM with Angular 2 – AEM Component development

In the previous 2 parts we have seen the challenges, project structure and build process for an AEM + Angular 2 project. In this post we’ll focus on Angular 2 component development and do a deep dive into a sample component so that we can understand the various concepts involved in AEM + Angular 2 component development.

Let’s begin by defining some basic requirements for the sample AEM + Angular 2 component that we’ll develop in this article.

Problem statement for our component
  • Develop a simple AEM + Angular 2 component to find location details based on an IP address
  • Use the Angular 2 Http service to get location details (by calling a remote REST web service) based on an IP address provided via an HTML form input
  • Display the data/response from the REST web service on the UI using Angular 2 binding
  • Provide a capability for authors so that they can change labels (e.g. the label of the “Search” button and the label of the “Location Detail” card/result) using the component’s dialog

Image 1: “ip-location” AEM + Angular 2 component in action on an AEM page

The idea behind developing this component is to cover AEM authoring capabilities as well as to show you how you can use Angular 2 features (like the Http service, data binding etc.)

Before we proceed, I assume that you already know how to develop a pure and simple AEM component and that you also know the Angular 2 framework. Both of these skills are a must to understand the rest of this tutorial.

So, let’s get started!!!

We’ll be developing this component in 2 phases:
  1. First we’ll develop a basic AEM component (no Angular 2 code involved here):
    1. Create an AEM component in your application the way you have been creating AEM components (let’s say the component name is ip-location). The ip-location component will have a Sightly template file called “ip-location.html”, and we are going to modify this file in the next step.
    2. Create a component dialog with 2 text fields, “./goBtnLabel” and “./localtionLabel”; these are the fields that authors will use to enter the text for the “Search Location” button and the “Location” label (as shown in the snapshot above).

  2. Modify the basic AEM component and add Angular 2 code: in this step we’ll modify the component by adding an Angular 2 component TypeScript file and template file. Here are the steps:
    1. Create a file in the component folder called “ip-location.component.ts”; this is the TypeScript file that will hold the Angular 2 component’s code. Here is a screenshot of the code (I’ll share the GitHub URL towards the end of this post):

Image 2: “ip-location.component.ts” Angular 2 component code

    2. At line # 10 we have added the Angular 2 annotation/decorator for the component, and at line # 12 we are loading a template/view for this component. NOTE the templateUrl path: we are not loading the “ip-location.html” file (which is actually the component’s original Sightly file); rather, we are loading a different file called “location.html”. This is because the Angular 2 template will not render properly in EDIT mode, and that’s why we include “location.html” explicitly from “ip-location.html” (as shown below, line # 4). Have a look at the test at lines 2 and 11.

Image 3: “ip-location.html” Sightly view that will load the Angular view in PREVIEW mode

Image 4: “location.html” Angular 2 template referred to and rendered by the Angular 2 component

    3. At line # 8 (Image 3), we have 2 HTML attributes, “goBtnLabel” and “localtionLabel”, populated using JCR property values (i.e. values entered by authors via the component dialog). Since we don’t have a direct way to access JCR properties from an Angular 2 component, we set them as HTML attributes so that we can read them in our component’s TypeScript code.

    4. At line # 29 (Image 2), we have defined a TypeScript function, “getProperty()”, that is responsible for reading the 2 HTML attributes “goBtnLabel” and “localtionLabel” that we defined at line # 8 (Image 3) and assigning them to the Angular 2 component variables defined at lines # 16 and 17 (Image 2) respectively. This is a hack to pass authored properties from the JCR to the Angular 2 component but, so far, it has worked perfectly fine for me.

    5. Once the “goBtnLabel” and “localtionLabel” variables in the TypeScript code are populated/bound, we can use these variables in the Angular 2 template as shown at lines # 6 and 16 (Image 4).

    6. In our Angular 2 component’s TypeScript code (line # 38, Image 2) we have defined a search() function that uses Angular 2’s “Http” service to call a REST API hosted by https://freegeoip.net. To call this REST API we read the IP address entered by the user in the input field defined in location.html (line # 4, Image 4). This input field is bound to an Angular 2 model variable defined in the TypeScript file (line # 15, Image 2).

So in this post we have seen:
  • How to integrate Angular 2 code with an AEM component
  • How to load the Sightly template (in EDIT mode) vs. render the actual Angular 2 component (in PREVIEW mode)
  • How to pass authored values from Sightly to the Angular 2 template and bind them to the Angular 2 model
  • How to use Angular 2 services from an AEM component

Just to summarize, in this 3 part series (Part 1, Part 2 and Part 3) we have discussed:
  • Challenges of integrating Angular 2 with AEM (Part 1)
  • Benefits of using Angular 2 with AEM (Part 1)
  • How the build process of an AEM application that uses Angular 2 differs from a regular Maven build (Part 2)
  • Development of an AEM component using Angular 2 (Part 3)

I hope this 3-part series helps you understand, integrate and develop AEM + Angular 2 applications. If you have any questions or suggestions, please feel free to post them.

Thanks for reading!!!


Part 2:

Reference
This sample uses REST API provided by https://freegeoip.net

KONE Hackathon on IoT, Smart Elevators, Smart Lights and Cloud

I have used elevators countless times in my life, and we all use them multiple times every day. Have you ever thought about how elevators could react in a smarter way? Does an elevator know who you are, and can it personalize itself for you? I think most of us (including me) had never thought about this (before the hacking event that I attended) but, trust me, there are companies like KONE, and many smart people who work there, challenging themselves and pushing hard to make elevators smart.

What exactly KONE is developing?

KONE is building APIs to control, manage and maintain elevators. These APIs collect data from elevators installed in buildings; they are the backbone, or I would say the brain, of KONE’s smart elevators. Once data is collected, the possibilities for harnessing valuable information from that data are endless. For example:

  • Feed the collected data to analytics tools to optimize elevator operation, usage etc.
  • Apply Artificial Intelligence and Machine Learning to give a smart brain to elevators

Also, KONE is looking for better and smarter ways to move people through buildings, not just in elevators, and for this KONE’s teams are working with other partners (Philips Smart Lighting and the IBM Cloud team).

Okay, that was about smart elevators. Have you thought about how lights can be smart? Until recently, most of us knew lights/bulbs as just illuminating devices that we control using a hardware on/off switch. Recently we have been introduced to a few newer ways to control lights using WiFi switches, mobile apps, voice commands/NLP (using Alexa, Google Home) etc. but, the teams at Philips Lighting are doing a great job of reimagining how lights can be smarter and how we can use them beyond just lighting rooms and spaces.

What exactly Philips Lighting’s team is developing?

How about if a light/bulb were like a tiny computer, connected to WiFi/the network, with sensors (temperature, motion, proximity etc.) attached to it that can be controlled using APIs... sounds crazy, right? This is exactly what the teams at Philips Lighting are working on, and these are called Smart Lights. Lights are everywhere: in living rooms, meeting rooms, corridors, elevators and even in open spaces. Since these smart lights are connected to the network and are capable of sensing critical information/data, they can help build solutions that change the way we look at lights. For example:

  • Identify vacant meeting rooms based on data captured from sensors installed in the lights
  • Guide a user from point A in a building to point B by incrementally illuminating lights one after another
  • Adjust light levels using data collected via sensors, and more…

Please watch this video for a quick overview: https://www.youtube.com/watch?v=Fmii9BNgixI

And finally, have you thought about how smart elevators and smart lights together can deliver a personalized user experience? This is where IBM’s (Cloud, IoT and Watson) teams are helping the elevator and lighting industries revolutionize the whole game by leveraging offerings like Cloud Computing, Cognitive APIs and AI capabilities.

I have no direct relation to either the elevator or the lighting industry, so why am I talking about elevators and lights?

Recently I got an opportunity to participate in a technology hackathon organized by KONE, where they invited their tech partners Philips Lighting and IBM along with developers, designers and innovators. This 2-day hackathon was focused on leveraging APIs from KONE and Philips Lighting, and IBM’s cloud capabilities, to think about solutions that can improve our daily life in elevators, offices or any other place.

I presented an idea that was focused on:

  • Provide a personalized user experience in large spaces where every single device (such as an elevator) or area (such as a hotel room) knows who you are, using state-of-the-art integration of Smart Elevator, Smart Lighting and IBM Watson APIs
  • No manual interaction with devices or spaces in the building
  • Allow users to create preferences (light ambiance, desk location etc.) and save them in a mobile app. Use these preferences to deliver a personalized experience, e.g. once a user enters a building, lights will guide them to their desk, set their preferred ambiance in the office room etc. If their desk changes in the future, the new place will be adjusted for the user
  • Provide real-time analytics about areas and people (such as age and gender) and perform optimization by capturing people flow in various areas of the building

The idea is that every device in the building will recognize who the user is, where they are headed and what their preferences are. This information will be stored as a shareable profile so that, no matter where they go in the world, everything (elevators, lights, the allocated room/space and other devices) will be personalized based on their profile preferences.

This idea can be applied to any space (offices, buildings, airports etc.). For the hackathon, I used the hotel industry for implementation. This is how it works for the hotel industry:

  1. A user checks in to the hotel by scanning a QR code (we had only 3 iBeacons during the hackathon, otherwise we would have removed the QR code from the picture so that the user doesn’t even need to scan anything).
  2. Once the user checks in to the hotel, a room is assigned to them based on their booking.
  3. As soon as the user reaches the elevator area/lobby, KONE’s elevator API is called to request an elevator based on the floor their assigned room is on. The important thing to note here is that the user doesn’t have to push a button to call an elevator: there is an iBeacon in the elevator area, the app knows which floor the user’s room is on, and it makes the call on behalf of the user using the KONE API.
  4. While in the elevator, the user can set their light preferences for the room (using Philips Lighting APIs) before they reach it.
  5. Once the elevator opens, Philips Lighting guides the user to their room by incrementally illuminating lights in the direction of the room.
  6. Once the user gets close to their assigned room, they can unlock it using the mobile app (again, we used an iBeacon here).
  7. Periodically, and based on sensor triggers, images are captured in various hotel areas and sent to IBM Watson’s visual recognition APIs, which analyze them and send back information like the number of people, age, gender etc. This information can be used for analytics and optimization of the user experience.

So, as you can see, right from hotel check-in to reaching the room, the user doesn’t have to press a button for the elevator, they are guided to their assigned room and their lights are set before they arrive.


I am adding some pictures of the mobile application and links to the hackathon event:







I hope that after reading this, whenever you see an elevator or a light you’ll think about how it can be personalized.

To all hackers out there...Happy Hacking!!

All Company Names, Branding, Logo and other information used in this article are owned by respective companies and organization.


Part 4: AEM with Angular 2 - Unit Testing Angular Components & Services

This article is a continuation of a multi-part series on AEM + Angular 2 integration. In the previous three parts, we learned:
  1. Angular 2 & Challenges of using AEM with Angular 2
  2. Project Structure of AEM + Angular 2 project and custom build script
  3. Developing AEM components using Angular 2
In this part, we’ll learn about unit testing the Angular code base and collecting code coverage metrics.

Unit testing is very important from a code quality perspective, and it becomes even more important when the project is large and multiple teams are working on it. In large projects, dividing a large system into small and loosely coupled modules is a best practice, but then it becomes critical to be able to test each of these modules independently, without them explicitly depending on each other.

In AEM, we have reusable components that can be used to create a variety of content, and it becomes very important to test each component. Typically, an AEM component has:
  1. User Interface (JSP, Sightly etc.) and JavaScript (Angular.js, React.js etc.)
  2. Some backing object/Sling Model 
  3. And/or WCM Use class (Java or JavaScript based)
In a well-designed and architected application, each of these 3 pieces should be independently unit testable. In this article, we’ll be focusing on testing the User Interface, i.e. #1.

Unit testing the UI is simple compared to developing AEM components using Angular 2. It is simple because we are not doing anything different for testing just because we are using AEM; testing is done in the usual way, as we would when we are not using AEM.

NOTE: Since we have our custom build script for building the project, we won’t be using the Angular CLI and standard project structure for testing.

I am going to use the same project from my GitHub repository and will add everything that we need for unit testing the UI (Angular code).

Before we proceed, let’s refresh few things:
  1. Code for this project is here: https://github.com/suryakand/aem-angular2
  2. In previous articles in this series we learned that we have two separate builds:
  • “gulp build” – builds the project so that everything can be tested as if it were a regular Angular application. This build generates artifacts in the build folder, and you can run the application with the command “npm run start”
  • “gulp build:aem” – builds the project in such a way that everything is packaged as a CRX package that can be deployed to AEM

Installation

If you have followed the first 3 articles in this series, then you should be all set with installation; please refer to those articles for the details.

Testing Frameworks

For UI unit testing we’ll be using:
  1. Jasmine – a JavaScript testing framework that supports a software development practice called Behavior Driven Development, or BDD for short
  2. Karma – manually running Jasmine tests by refreshing a browser tab repeatedly in different browsers every time we edit some code can become tiresome. Karma is a tool that lets us spawn browsers and run Jasmine tests inside them, all from the command line. The results of the tests are also displayed on the command line. Karma can also watch your development files for changes and re-run the tests automatically. In short, Karma lets us run Jasmine tests as part of a development tool chain which requires tests to be runnable and results inspectable via the command line
  3. Istanbul – a JavaScript code coverage tool written in JavaScript

Project Configuration

To write and execute tests we need to add/update some configuration files. In this section we’ll go over these changes.

  • Include UI unit testing and test runner dependencies in the package.json file and run “npm install” again

  • Add a task to run unit tests

2. Add the Karma runner configuration (karma.conf.js) – the complete configuration file is here: https://github.com/suryakand/aem-angular2/commit/316bdc3e2426f74c2d42a6d8339753f3cda4d485#diff-a068ef752f58b4eda47e5254ca70802d

This file defines where to look for the JS files that need to be considered for testing, which testing framework to use, which reporters (code coverage framework) to use, which port Karma should run on, etc.

Write Unit Test Cases

I have added a new folder called “ui.tests” where we’ll write all test cases for our Angular code. The new folder structure looks like this:



From an Angular perspective, there are 2 main entities for which we need to write test cases:
  1. Angular Components
  2. Angular Services
Let’s write a test case for an Angular component. For this article we’ll write a very basic test case for “about.component”. You can refer to the full example of the “about.component” test case here: https://github.com/suryakand/aem-angular2/blob/master/ui.apps/src/main/content/jcr_root/apps/ngaem/components/ui.tests/components/about.component.spec.ts

Here are the basic steps involved in writing a test case:
  1. Import the mock test dependencies, components and services needed for testing.
  2. Initialize a mock Angular environment/TestBed (as we would with a real application in the browser) by loading/initializing Angular components and services. This mock Angular environment initialized for testing is referred to as the TestBed. To simplify this I have created a utility class, “UtilTestBed”. You can use this class to initialize the mock TestBed in one line (see line # 19 in the linked example).
  3. Write logic/assert statements to test components. In this step, we create an instance of the component that we want to test and verify that it has been rendered properly. Verification can be done in various ways, e.g. verify that the expected HTML element is rendered (line # 30) and check the text/value of the rendered element (line # 37).

The same method can be followed to write test cases for Angular services too.

Once you have written your test cases, you can execute them using the following command:

npm run test

You should see the following output after execution completes:

Verifying Code Coverage Report

Once all test cases are finished, navigate to the “/aem-angular2/ui.apps/target/reports/coverage/report-html” folder and open index.html to see code coverage report. The report would look like this:

I hope this article helps you implement unit tests for your front-end code and maintain your code quality.

Thanks for reading!!!

Reference to other articles in this series:

Generating URL based on AEM RUN mode using AEM Externalizer Service

AEM allows us to create content for various channels. In large enterprise ecosystems, AEM is the central system that delivers content to mobile, web, email, big screens and devices like Echo Show. The same content can be rendered in different ways and formats, and that is a big advantage of AEM.

In this article, we are going to look at one specific challenge that becomes very visible when AEM delivers content via email, mobile etc. The challenge is making sure that links embedded in emails, and in content delivered via email, are not broken. There are various ways in which you can fix this problem:
  1. Dynamically determine the host URL based on the incoming request (Sling request and resolver) but, this is not possible in all situations (especially when the original request did not originate from a resource/page served by the same AEM instance)
  2. Create your own OSGi configuration to store the AEM instance host domain. This needs to be different for author and publish instances
  3. Hardcoded URLs … a big NO for this

Hasn’t the AEM team thought about this? Yes, they have and, if we want to solve this problem the AEM way, then we should use the AEM Externalizer service.

Externalizer is an OSGi service that programmatically transforms a resource path (e.g. /content/mysite/mypage) into an external and absolute URL (for example, http://www.site.com/content/mysite/mypage) by prefixing the path with a pre-configured DNS name. The author and publish mappings are provided for you but, we can also map other custom domains based on run mode/host DNS and, while building URLs, we can use the appropriate configuration based on the server’s run mode to generate external URLs.

In this article, we are going to create a simple OSGi service that can be used to generate a link for external content delivery.

Step 1: Configure AEM with URLs for all run modes
  1. Navigate to the system console by entering the URL http://<host>:<port>/system/console/configMgr
  2. Open the “Day CQ Link Externalizer” configuration, which will look like this:

  3. Define a domain mapping for custom domains (if needed)
  4. Save changes

Step 2: Create sample OSGi service (interface and implementation)
Create a simple interface for the sample OSGi service that we will implement:


Implement the ExternalLinkBuilderService interface:


Step 3: Inject the service created in Step 2 into other classes
Now we can inject this service into other OSGi components (using @Reference) and call the buildExternalLink() method with an internal path; it will return an absolute link based on the run mode.
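Since the original code screenshots are not reproduced here, the following plain-Java sketch illustrates the idea behind such a service. The class name and buildExternalLink() mirror the hypothetical ExternalLinkBuilderService above, and the run-mode-to-domain map stands in for what the real com.day.cq.commons.Externalizer reads from the “Day CQ Link Externalizer” configuration; this is not the actual AEM API.

```java
import java.util.HashMap;
import java.util.Map;

// Simplified stand-in for an OSGi service wrapping com.day.cq.commons.Externalizer.
// In real AEM code the domain mapping comes from the "Day CQ Link Externalizer"
// OSGi configuration, and the run mode is resolved by the service rather than
// passed in by the caller.
class ExternalLinkBuilderSketch {

    private final Map<String, String> domainsByRunMode = new HashMap<>();

    // Register a base URL for a run mode (e.g. "author", "publish").
    void configureDomain(String runMode, String baseUrl) {
        domainsByRunMode.put(runMode, baseUrl);
    }

    // Turn an internal resource path into an absolute URL for the given run mode,
    // e.g. "/content/mysite/mypage" -> "http://www.site.com/content/mysite/mypage"
    String buildExternalLink(String runMode, String resourcePath) {
        String base = domainsByRunMode.get(runMode);
        if (base == null) {
            throw new IllegalArgumentException("No domain configured for run mode: " + runMode);
        }
        return base + resourcePath;
    }
}
```

Configuring, say, “publish” as http://www.site.com and “author” as http://author.site.com lets the same calling code emit the correct absolute link on each tier.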

A small but very useful concept. I hope you have enjoyed reading this. Feel free to post your comments.

References:
Externalizer Documentation & API usage: https://helpx.adobe.com/experience-manager/6-3/sites/developing/using/reference-materials/javadoc/com/day/cq/commons/Externalizer.html


Part 1: Lazybones Generator for AEM

1.Introduction

This is the first part of a multi-part series where we’ll talk about how Lazybones is useful in AEM projects.

Starting a new project from scratch is an exciting task, and we want to give our best whenever we are starting something new. We try to pour all our past experiences and learnings together to make sure that we do not repeat mistakes that we have made, or seen, in other projects.

Developing large projects goes through various phases:

1)Requirement gathering and analysis
2)Create high level and low level architecture and technical design
3)Choosing frameworks/libraries for development
4)Choosing unit testing frameworks
5)Hooking up build scripts
6)Choosing dependency management tool
7)Defining DevOps strategy
8)Infrastructure – Cloud or In House
9)And many more….

Different people have worked on different projects, and each of them may have faced different problems and learned many best practices. It’s quite possible that when you are starting a new project you either don’t know the best practices that other individuals have learned (and so you don’t benefit from their learning), or you are working on a completely new technology and have no idea where and how to start. How about a tool which allows you to create a ready-to-go boilerplate project with all the best practices baked in?

In this article, we are going to learn about one such tool (Lazybones) and will use it to simplify the challenges of integrating AEM and Angular 5. Lazybones can do much more but, in this article, our focus will be on items 3, 4, 5 and 6 from the list above. Lazybones can be extended to handle other AEM related challenges; this is just a sample use case that we have picked to illustrate the benefits of using Lazybones with AEM.

2.AEM with Angular 5 project challenges

Adobe Experience Manager (AEM), is a content management platform for building websites, mobile apps and forms - making it easy to manage your marketing content and assets. AEM empowers business users and non-technical staff to create and manage content as per business needs.

Angular 5 is modular in nature, which means there are a number of smaller modules that you can mix and match based on what your project requires. When you are developing an Angular 5 application outside of AEM this is not a big challenge in terms of building and shipping the application, because many build tools (e.g. the Angular CLI) are available to take care of managing these smaller Angular modules and dependencies. But in AEM, components are just building blocks: content authors can drag and drop these components anywhere, and as many times as they need, so reusability is key, i.e. authors should be able to use Angular 5 components in the same way as regular AEM components. This means the build tools available today for Angular 5 applications can’t help, because those tools need to know upfront where components are used before they compile the application but, in AEM, this is not known in advance (as we discussed above).

To address this problem, we need to write a complex build script, hook up the right dependencies (both for AEM and Angular) and integrate unit testing libraries.

Follow this link to read more about the challenges in detail: http://suryakand-shinde.blogspot.com/2017/06/part-1-angular-2-with-aem-introduction.html

3.Lazybones

3.1.What is Lazybones

Definition from Lazybones repository page:
It allows you to create a new project structure for any framework or library for which the tool has a template. You can even contribute templates by sending pull requests to this GitHub project or publishing the packages to the relevant Bintray repository.
The concept of Lazybones is very similar to Maven archetypes, and what Yeoman does for web applications. Lazybones also includes a sub-templates feature that resembles the behavior of Yeoman's sub-generators, allowing you to generate optional extras (controllers, scaffolding etc.) inside a project.

3.2.Why Lazybones

Often when we are working on a project we spend a lot of time creating the project foundation, folders, the base skeleton of the project, hooking up build scripts, adding the right dependencies and other DevOps-related tools/configurations. If you look for solutions, you will quickly find that people have solved this problem using scaffolding tools such as Maven archetypes and Yeoman.

If you are working on projects that have different needs from a build, code organization and packaging perspective, you’ll realize that the standard Maven archetypes available publicly will not suffice. This is where Lazybones’ capabilities help. In the next few sections we’ll learn:
  • Installing Lazybones
  • What is Lazybones
  • How to create a Lazybones projects template to generate an AEM + Angular 5 project
  • How to use template
  • Use cases for Lazybones
  • Best practices

3.3.Installing Lazybones

The easiest way to install Lazybones is with SDKMAN.
Step 1:  Installation and validation of SDKMAN
  • Open a terminal and type the command curl -s "https://get.sdkman.io" | bash
  • Quit and reopen the terminal
  • Validate the installation by typing the command sdk version in the terminal


Fig: Eclipse Gradle project import modal

Step 2:  Installation and validation of Lazybones

  • Open a terminal and type the command sdk install lazybones

For more information about installation, please visit https://github.com/pledbrook/lazybones

4.Lazybones Use Cases and Advantages

We saw the project setup challenges, the available options, and learned about Lazybones templates. In this section, we’ll see where we should use it and what the advantages of using it are.

4.1.Use Cases

  • Projects composed of various technologies and frameworks
  • Projects that are complex to hand craft and take a lot of time to set up
  • Projects that have a large set of configurations that are difficult to remember while creating the project
  • Projects that need the same project structure, configurations and setup to be repeated multiple times
  • Projects in which traditional scaffolding mechanisms (e.g. Maven archetypes) are insufficient
  • Projects which need a customized build script that is difficult to hand craft every time

4.2.Advantages

  • Less time to set up a new project
  • No need to worry about complex configurations, build process etc.
  • A standardized way of creating projects and other artifacts (components, templates, services etc.) using best practices that are baked into Lazybones templates
  • Individual developers don’t need to worry about complexities or spend time writing boilerplate code


Maven profiles – Operating Systems (OS) specific build profiles

Maven is a powerful build framework that allows us to customize every aspect of the build with simplicity.

In this article, we’ll learn how to create different build profiles for Windows and macOS. The same concept can be applied to other operating system or environment specific builds.

Let’s define a use case.

Let’s say we have a project in which we want to compile and package an Angular 2+ app as part of the Maven build. The Angular 2+ project depends on Node.js (npm) and the Angular CLI (ng). We can use the Exec Maven Plugin to invoke the npm and ng commands.

On macOS the “npm” and “ng” commands work without any additional extension. To run the same commands on Windows we need to append the .cmd extension. This means that if the development team is a mix of people using both Windows and macOS, the pom.xml file would need to be updated every time. What if someone pushes this file to the central code repository? It would obviously break the builds on the CI/CD server (e.g. Jenkins). This is just one scenario; there may be other cases where the build command/process has to differ based on operating system or environment, and it is not a good idea to modify the pom.xml file locally for a build and leave room for accidental commits/pushes. This is where Maven profiles come in handy.
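Maven can select such profiles automatically: a profile can be activated based on the detected operating system family, so developers never have to pass -P flags by hand. A typical activation block (the profile id here is illustrative) looks like this:

```xml
<profile>
  <id>windowsBuild</id>
  <activation>
    <!-- Activated automatically when Maven detects a Windows OS -->
    <os>
      <family>windows</family>
    </os>
  </activation>
  <!-- Windows-specific build configuration goes here,
       e.g. <executable>npm.cmd</executable> instead of <executable>npm</executable> -->
</profile>
```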

Let’s continue with the same Angular 2+ build example. On macOS, the build section (responsible for building the Angular project using the Exec Maven Plugin) of the pom.xml file would look like this:


<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <modelVersion>4.0.0</modelVersion>

  <!-- Omitted remaining pom.xml for simplicity -->

  <plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>exec-maven-plugin</artifactId>
    <executions>
      <!-- Execute the UI build as part of the standard Maven lifecycle -->
      <execution>
        <id>ui-npm-install</id>
        <configuration>
          <executable>npm</executable>
          <arguments>
            <argument>install</argument>
          </arguments>
          <workingDirectory>${basedir}</workingDirectory>
        </configuration>
        <phase>validate</phase>
        <goals>
          <goal>exec</goal>
        </goals>
      </execution>
      <execution>
        <id>ui-aem-build</id>
        <configuration>
          <executable>ng</executable>
          <arguments>
            <argument>build</argument>
            <argument>--prod</argument>
            <argument>--outputHashing</argument>
            <argument>none</argument>
          </arguments>
          <workingDirectory>${basedir}</workingDirectory>
        </configuration>
        <phase>generate-sources</phase>
        <goals>
          <goal>exec</goal>
        </goals>
      </execution>
      <execution>
        <id>npm-clean</id>
        <configuration>
          <executable>rm</executable>
          <arguments>
            <argument>-rf</argument>
            <argument>node_modules</argument>
          </arguments>
          <workingDirectory>${basedir}</workingDirectory>
        </configuration>
        <phase>install</phase>
        <goals>
          <goal>exec</goal>
        </goals>
      </execution>
    </executions>
  </plugin>

  <!-- Omitted remaining pom.xml for simplicity -->
</project>


Look at the “npm”, “ng” and “rm” executables in the configuration above. These commands are OS dependent, and if we run a Maven build with this configuration on Windows, it’ll fail.

To handle this the Maven way, we can create separate profiles for Mac OS and Windows. The following pom.xml shows an example of OS-specific build profiles.

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<!-- Omitted remaining pom.xml for simplicity -->
<profiles>
<!-- Mac OS specific build profile -->
<profile>
<id>macOSBuild</id>
<activation>
<!-- Profile is active by default -->
<activeByDefault>true</activeByDefault>
</activation>
<build>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<executions>
<!-- Run the UI build steps during standard Maven lifecycle phases -->
<execution>
<id>ui-npm-install</id>
<configuration>
<executable>npm</executable>
<arguments>
<argument>install</argument>
</arguments>
<workingDirectory>${basedir}</workingDirectory>
</configuration>
<phase>validate</phase>
<goals>
<goal>exec</goal>
</goals>
</execution>
<execution>
<id>ui-aem-build</id>
<configuration>
<executable>ng</executable>
<arguments>
<argument>build</argument>
<argument>--prod</argument>
<argument>--outputHashing</argument>
<argument>none</argument>
</arguments>
<workingDirectory>${basedir}</workingDirectory>
</configuration>
<phase>generate-sources</phase>
<goals>
<goal>exec</goal>
</goals>
</execution>
<execution>
<id>npm-clean</id>
<configuration>
<executable>rm</executable>
<arguments>
<argument>-rf</argument>
<argument>node_modules</argument>
</arguments>
<workingDirectory>${basedir}</workingDirectory>
</configuration>
<phase>install</phase>
<goals>
<goal>exec</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
</profile>

<!-- Windows OS specific build profile -->
<profile>
<id>winOSBuild</id>
<activation>
<!-- Profile is NOT active by default -->
<activeByDefault>false</activeByDefault>
</activation>
<build>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<executions>
<!-- Run the UI build steps during standard Maven lifecycle phases -->
<execution>
<id>ui-npm-install</id>
<configuration>
<executable>npm.cmd</executable>
<arguments>
<argument>install</argument>
</arguments>
<workingDirectory>${basedir}</workingDirectory>
</configuration>
<phase>validate</phase>
<goals>
<goal>exec</goal>
</goals>
</execution>
<execution>
<id>ui-aem-build</id>
<configuration>
<executable>ng.cmd</executable>
<arguments>
<argument>build</argument>
<argument>--prod</argument>
<argument>--outputHashing</argument>
<argument>none</argument>
</arguments>
<workingDirectory>${basedir}</workingDirectory>
</configuration>
<phase>generate-sources</phase>
<goals>
<goal>exec</goal>
</goals>
</execution>
<execution>
<id>npm-clean</id>
<configuration>
<!-- rmdir is a cmd built-in, not an .exe, so invoke it through cmd /c -->
<executable>cmd</executable>
<arguments>
<argument>/c</argument>
<argument>rmdir</argument>
<argument>/s</argument>
<argument>/q</argument>
<argument>node_modules</argument>
</arguments>
<workingDirectory>${basedir}</workingDirectory>
</configuration>
<phase>install</phase>
<goals>
<goal>exec</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
</profile>
</profiles>
<!-- Omitted remaining pom.xml for simplicity -->
</project>

A few important observations:

·      We have moved the “exec-maven-plugin” configuration inside each profile’s build section
·      We created two profiles, “macOSBuild” and “winOSBuild”
·      The “macOSBuild” profile is active by default (activeByDefault is set to true). This means that on Mac OS we can simply run “mvn clean install” and the Mac profile will automatically be picked up for the build
·      To execute the build on Windows, we need to explicitly tell Maven to use the “winOSBuild” profile. This can be done via “mvn clean install -PwinOSBuild”
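
As an alternative to passing -P manually, Maven can also activate a profile automatically based on the operating system of the machine running the build. A minimal sketch of the activation blocks, reusing the profile ids from the example above (the build sections stay the same):

```xml
<!-- Activates automatically on Mac OS -->
<profile>
<id>macOSBuild</id>
<activation>
<os>
<family>mac</family>
</os>
</activation>
<!-- build section as shown above -->
</profile>

<!-- Activates automatically on Windows -->
<profile>
<id>winOSBuild</id>
<activation>
<os>
<family>windows</family>
</os>
</activation>
<!-- build section as shown above -->
</profile>
```

With OS-based activation in place, “mvn clean install” picks the right profile on both operating systems without any -P flag. You can verify which profiles are active on a given machine with “mvn help:active-profiles”.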

I hope this simple concept helps you avoid manual changes to pom.xml and keeps your project away from the risk of accidental build failures.

AEM – Create Maven project using archetype


To create a simple AEM project using Maven you can use the latest archetype from Adobe. Have a look at the GitHub project https://github.com/adobe/aem-project-archetype

To create a new project, use the following command:


mvn archetype:generate -DarchetypeGroupId=com.adobe.granite.archetypes -DarchetypeArtifactId=aem-project-archetype -DarchetypeVersion=19 -DoptionIncludeExamples=y -DoptionIncludeErrorHandler=y

This will ask a series of questions/inputs while generating the project, e.g. apps folder name, site name, etc.


Once the project folder structure is generated, you can use the following command to build the project and install it on a locally running AEM instance:

mvn clean install -PautoInstallPackage

NOTE: Make sure that your AEM instance is running on port 4502. If you want to change the host or port, refer to the pom.xml (available in the root) of the generated project.
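
As a sketch of that host/port override: recent archetype versions expose the target instance as Maven properties in the generated parent pom.xml (typically named aem.host and aem.port — verify the exact property names in your own generated project), so you can override them on the command line instead of editing the file:

```shell
# Deploy to an AEM instance on a non-default host/port.
# aem.host / aem.port are the property names used by recent archetype
# versions -- check the generated pom.xml before relying on them.
mvn clean install -PautoInstallPackage -Daem.host=localhost -Daem.port=4503
```

This keeps pom.xml untouched, which avoids the same accidental-commit risk discussed in the Maven profiles article above.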


AEM as a Cloud Service (AEMaaCS) – Quick Introduction

AEM as a Cloud Service (AEMaaCS) – Introduction

If you are working in content management space, you must have heard of Adobe Experience Manager (AEM, previously known as CQ5). It is one of the core products offered by Adobe as part of their Adobe Experience Cloud. AEM has evolved over the time from all perspectives (features, maturity, stability, integration with other products and deployment model).

 

As of today, organizations have 3 options to deploy AEM:

  1. AEM on-premises
  2. AMS – Adobe Managed Service (mainly hosted on AWS)
  3. AEM as a Cloud Service (AEMaaCS) 

The focus of this article will be on AEM as a Cloud Service (AEMaaCS). Adobe Experience Manager as a Cloud Service is Adobe’s new-generation, software-as-a-service offering of AEM.

 

AEMaaCS is based on the leading solution Adobe Experience Manager (AEM) and offers outstanding content management (CMS) capabilities and digital asset management (DAM) for marketing and communication teams.

 

The solution has been entirely designed for the cloud and is scalable, secure, always available, and up to date.

 

What is AEM as a Cloud Service?

In very simple terms, consider that the old classic AEM was a big monolithic application and had limitations (e.g., scalability, extensibility) like any other legacy application that is not cloud-native. AEM as a Cloud Service is the result of refactoring the monolithic AEM application into a set of modular components/services that are cloud-native. Because of this refactoring, AEM as a Cloud Service inherits cloud benefits like scalability, agility and extensibility. Many core concepts like replication, asset computing/processing and repository services have changed as part of the AEMaaCS offering. We'll look at each of these concepts in detail in upcoming articles.


Adobe Experience Manager (AEM) as a Cloud Service is the latest offering of the AEM product line, helping you continue to provide your customers with personalized, content-led experiences. It provides cloud-native agility to accelerate time to value and is extensible to meet your unique business requirements. You can build on past investments and innovations by preserving and extending all your use cases and functionalities.

AEM as a Cloud Service lets you capitalize on the AEM applications in a cloud-native way, so that you can:

  • Scale your DevOps efforts with Cloud Manager: CI/CD framework, autoscaling, API connectivity, flexible deployment modes, code quality gates, service delivery transparency, and guided updates.
  • Enable developers to add automation to application development practices.
  • Deliver content quickly and efficiently on a global scale, using a built-in Content Delivery Network (CDN) and other network-layer best practices.
  • Leverage a dynamic architecture that auto-scales, thus removing infrastructure considerations.
  • Stay on top of threats and security-risk mitigation, using automated tests to scan for common vulnerabilities.
  • Ensure maximum resilience and efficiency backed by optimized performance topologies.
  • Take advantage of AEM as a Cloud Service’s deep integration with the Adobe Experience Cloud to provide better customer experiences with online marketing and web analytics products.
  • Utilize tools that help accelerate the migration tasks, such as code refactoring, transfer of content, and more.

Typical AEM as a Cloud Service environment

A new project onboarding onto AEMaaCS will be provisioned under a Program. There are three types of environments available with a Program of AEM as a Cloud Service:

  • Production environment: hosts the applications for the business practitioners.
  • Stage environment: is always coupled to a single production environment in a 1:1 relationship. The stage environment is used for various performance and quality tests before changes to the application are pushed to the production environment.
  • Development environment: allows developers to implement AEM applications under the same runtime conditions as the stage and production environments.


Figure 1: AEM as a Cloud Service (simplified component view)

 

Apart from the environments we saw above, an AEM program is the container that includes the codebase and the provisioned solutions.

Any new AEM project is always bound to exactly one specific codebase, where you can store both configuration and custom code for your project. This information is stored in a code repository, accessible via the usual Git clients, made available to you at the time new programs are created.

A program can include the following solutions:

  • AEM Cloud Sites Service
  • AEM Cloud Assets Service

Both of these allow access to a number of features and functionalities. The author tier contains all Sites and Assets functionality for all programs, but Assets-only programs will not have a publish tier, nor a preview tier, by default.

 

Core benefits of AEM as a Cloud Service:

  • It is always on with zero downtime
  • It is always at scale
  • It is always current with latest features/upgrades
  • It is always evolving (Adobe constantly adds new standards and best practices, which are automatically included by default)
  • Low cost of ownership
  • Usage based license model
  • More secure as it is always on the latest security level

Why AEM as a Cloud Service?

Before we answer why AEM as a Cloud Service, we need to look at the expectations of consumers and businesses in today’s era. We also need to look at the limitations/challenges with either classic/on-premises AEM or AMS.

 

At a high level, customers want:

  • Better experience (personalization, just in time experience)
  • Relevant content/information
  • Fast and seamless experience

On the other hand, businesses expect:

  • Customer satisfaction
  • Lower cost of delivery
  • Self-resilient applications/IT infrastructure
  • Modernized and scalable applications

From a business perspective, let’s also look at a quick comparison of the two modes in which AEM was offered before AEMaaCS:



Also, there were challenges that existed in both offerings:

  • Scalability limitations because of the way the Oak/JCR repository works in classic AEM
  • Computational limitations (e.g., asset processing and rendition generation)
  • Content replication related issues (performance, reliability etc.) 

Some of these limitations directly translate into the need for a cloud-native solution, so that AEM can scale and inherit the security benefits of the cloud along with a lower cost of ownership; this is why Adobe came up with the AEMaaCS offering.

 

A lot has changed (for good reasons) in AEMaaCS! I’ll cover all of that in more detail in my other articles (coming up soon!). For now, consider that old classic AEM has been refactored into the following:

  • A containerized architecture for scalability and to make AEM cloud-native
  • A set of smaller, scalable services (repository service, asset compute service, etc.) for extensibility and performance improvements

In my upcoming articles, I’ll discuss in detail about:

  • Architecture and core components of AEM as a Cloud Services
  • AEM as a Cloud Service and DevOps
  • AEM as a Cloud Service and Dispatcher
  • The AEM SDK for AEM as a Cloud Service
  • How developers need to adapt to work with AEM as a Cloud Service
  • Migration from existing deployments (AEM on-premises or AMS) to AEM as a Cloud Service

 

AEM as a Cloud Service (AEMaaCS) – Architecture Overview



AEM as a Cloud Service (AEMaaCS) – Architecture


Adobe Experience Manager (AEM) is one of the leading CMS platforms from Adobe and is part of Adobe Experience Cloud (AEC). AEM web content management offers a set of capabilities for creating, managing, delivering, and personalizing content across various digital marketing channels, including web, mobile, and email.

Before we dive into the architecture of AEM as a Cloud Service (AEMaaCS), we need to understand why AEMaaCS is needed and how it differs from classic AEM. To understand this, we’ll first look at:

  • Evolution/history of AEM
  • Challenges with traditional/classic AEM (the AEM that the majority of us have used for the last 10-12 years)

Evolution/History of AEM


The software was originally launched in the early 2000s as Communiqué (CQ) by Day Software of Basel, Switzerland. After CQ 5.3 had been released, Day Software was acquired by Adobe Systems in 2010, and CQ became Adobe Experience Manager.

From a technology perspective, AEM has evolved mainly in the last 5 years. 2020 is when AEM came out as a Cloud Service for the first time, allowing for additional scalability, faster updates, and consistent accessibility.



Figure 1: Evolution of AEM as a Cloud Service

 

Challenges with classic AEM

Let’s quickly look at the components of classic AEM application to understand the challenges with old AEM.


Figure 2: Traditional AEM Architecture (high-level)


As you can see above, AEM is a combination of many frameworks (Apache Felix, OSGi, the JCR/Oak repository and many other OSGi modules). The important thing to note is that all these modules run in a single JVM. This is a big limitation in terms of performance and scalability of AEM. Apart from the JVM, traditional AEM also suffered from issues related to the JCR repository, asset ingestion, replication and more.


Here are some key challenges with traditional AEM:

  • A typical classic/old AEM instance (author or publish) runs on a single JVM along with all OSGi modules and supporting components
  • Oak/JCR repository limitations add to the scalability issues (slow I/O operations under load, limits on the number of parent/child nodes, performance degradation due to concurrent user access)
  • Computational limitations (e.g., asset processing and rendition generation)
  • Content replication related issues (performance, reliability etc.)

At a high level these issues are very common and exist in any traditional monolithic application. Because of these issues and its architecture, AEM was not cloud-ready/native and was not capable of scaling dynamically. Where there are challenges, there are needs. So, the need was to make AEM cloud-native, and hence AEMaaCS came into the picture.


These issues and limitations led to refactoring and architectural changes of traditional AEM to make it cloud-native and scalable. Obviously, there are other benefits as well, but we’ll focus only on the architectural aspect in this article.


Please note that, from an end user’s (mainly developers and authors) perspective, nothing has changed significantly; these refactoring changes are very much transparent to developers and authors, who will continue to use the same tools and processes for development and content creation/publishing. Also, AEMaaCS does not disrupt existing on-premises and AMS deployments.


AEM as a Cloud Service Architecture


If you have been following along, you have seen the high-level challenges with traditional AEM (above). Before we deep dive into the architectural discussion, I want to describe in very simple terms what “AEM as a Cloud Service” is, so that it is easy for you to follow the rest of this article:

 

“In very simple terms, consider that traditional AEM was a big monolithic application and had limitations (e.g., scalability, extensibility) like any other legacy application. AEM was not cloud-native, and hence we were missing the benefits of the cloud in AEM. AEM as a Cloud Service is the result of refactoring the traditional monolithic AEM application into a set of modular components/services that are cloud-native, and these components can be scaled dynamically in the cloud.”


Now that we have the context and background about the challenges with traditional AEM, let’s look at the various components of the AEMaaCS architecture. We’ll also see how it addresses the challenges described above in this article.


Below are key components of AEMaaCS:

 


Figure 3: AEM as a Cloud Service Architecture

  • Content Repository Service
  • Asset Compute (Microservices) Service
  • Container Orchestration Service
  • Cloud Manager
  • Replication Service

Content Repository Service


Component detail:

In AEMaaCS, author nodes are designed for high availability with complete separation between code and content/assets. As illustrated below (in Figure 4), these nodes are connected to the Content Repository Service, which stores the structured content and assets in a separate store outside the AEM instances, while all binaries (e.g., the base AEM jar, patches etc.) are stored separately in a blob store. This allows product updates to happen without any downtime or interruption to content creators. While new nodes are being updated with the latest features, existing nodes on the previous version keep running in the background. As soon as the new nodes are ready, they get connected to the Content Repository Service (shared between all author nodes) and start serving requests. At this point, the old nodes are retired. This redundancy helps ensure that the system is always up to date with no interruption to service.

What challenge does it solve?

In a traditional dedicated hosted environment, the resources (e.g., Oak repository throughput, JVM) available to the system are static and limited, requiring infrastructure planning for compute capacity and memory to ensure sufficient capacity. However, predicting traffic is challenging and can put the consumer experience at risk, which leads many brands to overestimate capacity, hence increasing total costs.

Now, with this centralized Content Repository Service, we can just define the performance SLAs and let AEMaaCS deal with scaling AEM instances up/down based on traffic load. We don’t need to worry about how content will be replicated to newly created AEM instances because AEMaaCS takes care of it for us.

  


 Figure 4: AEMaaCS Content Repository Service


Asset Compute (Microservices) Service


Component detail:

In AEMaaCS, users upload assets/binaries directly to a partition in the cloud storage container. This container is dedicated to storing assets and is separated from the Oak repository.

As illustrated in the figure below, when an authorized user initiates an asset upload request, Experience Manager is notified to trigger optimization of partitioning, followed by allocation of a private location in the binary cloud storage container. This partitioning of high-quality, large-size content into smaller, manageable chunks, instead of handling one big file, is key to significantly speeding up the ingestion process. Once all the parts of the asset have been uploaded to the storage container, Experience Manager is notified of the completion, allowing it to register the new asset in its repository.

Once assets are ingested, they need to be rendered and processed before they can be used widely. Rendering and processing generally entail a series of back-end operations such as extraction of rich XMP metadata, text extraction to power search, creation of thumbnails, Sensei-powered smart tagging, Photoshop-imaging-engine and web renditions to provide previews of the assets, custom image renditions, and limited video transcoding. This is the job that is done efficiently by asset microservices.

What challenge does it solve?

Depending on the volume and quality of assets, the ingestion operation can slow down an author instance with highly demanding processing requests.

Traditionally, large asset processing has required IT teams to plan for extra compute capacity depending on predictable seasonal demands, such as preparation by marketing teams for product launches or holiday season sales. This not only results in overestimated capacity but also leaves assets vulnerable in unpredictable times.

Asset processing in Cloud Service uses a microservice architecture to carry out this demanding operation outside the Experience Manager environment, saving resources and capacity planning.

  


Figure 5: AEMaaCS Asset Computer and Microservices

 

Container Orchestration Service


Component detail:

In AEMaaCS, author, publish and dispatcher nodes are nothing but containers or images. It is important to note that the container itself is immutable and is a combination of binaries (e.g., the AEM jar) and custom code/content (code, configuration). This immutable nature of the containers allows AEMaaCS to scale dynamically.

These containers are used to spin up new instances as and when AEMaaCS needs to scale up. The Container Orchestration Service is an integral part of Cloud Manager; it communicates with the container manager to create and kill containers as needed. The Container Orchestration Service and Cloud Manager work together.

What challenge does it solve?

Traditional AEM was not dynamically scalable. IT teams had to estimate capacity upfront or had to scale the instances manually (by adding new nodes). Note that apart from the manual intervention, the process of setting up a new instance involved content replication, dispatcher installation, configuration and much more.

In AEMaaCS, every AEM node is an immutable container/image (made up of code + configuration + base AEM jar). The Container Orchestration Service uses these containers to scale AEM dynamically in the cloud (based on events triggered by Cloud Manager).

  

Cloud Manager


Component detail:

In simple terms, consider Cloud Manager as your liaison to your IT team. Most of the tasks that DevOps and development teams traditionally used to do are done by Cloud Manager.

Cloud Manager’s key capabilities:

  • Continuous Integration/Continuous Delivery (CI/CD) of code
  • API connectivity to complement existing DevOps
  • Code inspection, performance testing, and security validation based on best practices before pushing to production, to minimize production disruptions
  • An autoscaling feature that intelligently detects the need for increased capacity and automatically brings additional Dispatcher/Publish segment(s) online. For this task Cloud Manager works with the Container Orchestration Service

What challenge does it solve?

In a traditional AEM installation, IT teams manage the AEM infrastructure, DevOps (CI/CD), release management and configuration. The development team’s responsibility is to ensure that they follow best practices to write secure and scalable code. Many of these tasks were redundant and error prone. In some cases, organizations do not even have robust DevOps and development standards in place; the reason could be budget, lack of skilled resources or anything else.

Cloud Manager takes this burden away from IT teams by offering these services implicitly. The following teams can focus on innovation rather than doing trivial tasks:

  • Infrastructure team
  • DevOps team
  • Development team

 

 


 

 Figure 6:AEM as a Cloud Service - Cloud Manager and CI/CD


Replication Service


Component detail:

In AEMaaCS, the author tier, the preview tier, and the publish tier read and persist content from/to the Content Repository Service.

Replication in AEMaaCS is handled differently as compared to a traditional AEM implementation. Traditional replication agents are NOT used in AEMaaCS; rather, an event-based mechanism is used to replicate content from the author to multiple publish instances. In this event-based replication architecture, author and publish instances are not aware of each other, and hence we don’t need to configure replication agents. In AEM as a Cloud Service, content is published using Sling Content Distribution.

When content is approved on the author tier, this is an indication that it can be activated and therefore pushed to the publish tier persistence layer, or optionally to the preview tier. This happens via the Replication Service, a middleware pipeline. This pipeline receives the new content, with the individual publish service (or preview service) nodes subscribing to the content pushed to the pipeline.

What challenge does it solve?

In a traditional AEM installation, replication agents are responsible for replicating content from the author to one or more publish instances. This replication is handled via Sling jobs. Based on the number of publish instances in the publish farm, the author instance gets busy (processing Sling jobs) and overloaded publishing content to every instance in the farm; this is one challenge.

Another challenge with traditional AEM replication is that the author has to be made aware of each publish instance by configuring a replication agent. This agent configuration process needs to be repeated every time a new publish node is added to the publish farm. To scale AEM dynamically in the cloud, this manual step had to be removed so that Cloud Manager can spin up new containers as needed.

Since in the AEMaaCS architecture replication is handled via the event-based Replication Service, we don’t need to configure agents manually. Also, the replication jobs that degrade author instance performance are not executed on the author; instead, they are handled by the external Replication Service.

 

I hope the above explanation gives you a clear picture of the key differences/changes between traditional AEM and AEMaaCS, why these changes were needed, and how they address the challenges we had with a traditional AEM installation.


In this article I just wanted to provide a very high-level overview of AEMaaCS. I’ll try to cover each of these topics in more detail.


Also, I want to say this again: from an end user (mainly developers and authors) perspective, nothing has changed significantly; these changes are very much transparent, and developers and authors will continue to use the same tools and processes for development and content creation/publishing. AEMaaCS provides secure, always up to date and robust AEM infrastructure with best-practice guardrails baked in, so that teams can focus more on innovation and digital experience design/delivery.

 
