Drupal Planet

PreviousNext: Encrypted Drupal Database Connections with Amazon RDS

August 8, 2018 - 14:46

Malicious users can intercept or monitor plaintext data transmitted across unencrypted networks, jeopardising the confidentiality of sensitive data in Drupal applications. This tutorial will show you how to mitigate this type of attack by encrypting your database queries in transit.

by Nick Santamaria / 8 August 2018

With attackers and data breaches becoming more sophisticated every day, it is imperative that we take as many steps as practical to protect sensitive data in our Drupal apps. PreviousNext use Amazon RDS for our MariaDB and MySQL database instances. RDS supports SSL encryption for data in transit, and it is extremely simple to configure your Drupal app to connect in this manner.

1. RDS PEM Bundle

The first step is ensuring your Drupal application has access to the RDS public certificate chain to initiate the handshake. How you achieve this will depend on your particular deployment methodology - we have opted to bake these certificates into our standard container images. Below are the lines we've added to our PHP Dockerfile.

# Add Amazon RDS TLS public certificate.
ADD https://s3.amazonaws.com/rds-downloads/rds-combined-ca-bundle.pem  /etc/ssl/certs/rds-combined-ca-bundle.pem
RUN chmod 755 /etc/ssl/certs/rds-combined-ca-bundle.pem

If you use a configuration management tool like Ansible or Puppet, the same principle applies - download that .pem file to a known location on the app server.

If you have limited control of your hosting environment, you can also commit this file to your codebase and have it deployed alongside your application.

2. Drupal Database Configuration

Next you need to configure Drupal to use this certificate chain if it is available. The PDO extension makes light work of this. This snippet is compatible with Drupal 7 and 8.

$rds_cert_path = "/etc/ssl/certs/rds-combined-ca-bundle.pem";
if (is_readable($rds_cert_path)) {
  $databases['default']['default']['pdo'][PDO::MYSQL_ATTR_SSL_CA] = $rds_cert_path;
}
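
If your PHP build supports it, you can take this one step further and have PDO verify that the certificate presented by the server actually chains to the RDS CA bundle. This is a sketch rather than part of the original snippet: PDO::MYSQL_ATTR_SSL_VERIFY_SERVER_CERT requires PHP 7.0.18 / 7.1.4 or later, and it assumes you connect using the RDS endpoint hostname so the certificate matches.

$rds_cert_path = "/etc/ssl/certs/rds-combined-ca-bundle.pem";
if (is_readable($rds_cert_path)) {
  $databases['default']['default']['pdo'][PDO::MYSQL_ATTR_SSL_CA] = $rds_cert_path;
  // Optionally reject connections whose certificate doesn't chain to the CA bundle.
  if (defined('PDO::MYSQL_ATTR_SSL_VERIFY_SERVER_CERT')) {
    $databases['default']['default']['pdo'][PDO::MYSQL_ATTR_SSL_VERIFY_SERVER_CERT] = TRUE;
  }
}
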
3. Confirmation

The hard work is done; now you'll want to confirm that the connections are actually encrypted.

Use drush to smoke-check that the PDO options are being picked up correctly. Running drush sql-connect should give you a new flag: --ssl-ca.

$ drush sql-connect

mysql ... --ssl-ca=/etc/ssl/certs/rds-combined-ca-bundle.pem
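
Another quick check, if you prefer to stay inside Drupal, is to ask MySQL which cipher the current connection negotiated. This is a sketch using the Drupal 8 database API (run it via drush php-eval or similar); an empty Ssl_cipher value means the connection is not encrypted.

// Returns something like ['Variable_name' => 'Ssl_cipher', 'Value' => 'AES256-SHA'].
$status = \Drupal::database()
  ->query("SHOW STATUS LIKE 'Ssl_cipher'")
  ->fetchAssoc();
print_r($status);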

If that looks OK, you can take it a step further and sniff the TCP connection between Drupal and the RDS server.

This requires root access to your server, and the tcpflow package installed - this tool will stream the data being transmitted over port 3306. You want to see illegible, garbled data - definitely not content that looks like SQL queries or responses!

Run this command, and click around your site while logged in (to ensure minimal cache hits).

$ tcpflow -i any -C -g port 3306

This is the type of output which indicates the connection is encrypted.

tcpflow: listening on any

x1c
"|{mOXU{7-rd 0E
W$Q{C3uQ1g3&#a]9o1K*z:yPTqxqSvcCH#Zq2Hf8Fy>5iWlyz$A>jtfV9pdazdP7
tpQ=
i\R[dRa+Rk4)P5mR_h9S;lO&/=lnCF4P&!Y5_*f^1bvy)Nmga4jQ3"W0[I=[3=3\NLB0|8TGo0>I%^Q^~jL
L*HhsM5%7dXh6w`;B;;|kHTt[_'CDm:PJbs$`/fTv'M .p2JP' Ok&erw
W")wLLi1%l5#lDV85nj>R~7Nj%*\I!zFt?w$u >;5~#)/tJbzwS~3$0u'/hK /99.X?F{2DNrpdHw{Yf!fLv
`
KTWiWFagS.@XEw?AsmczC2*`-/R rA-0(}DXDKC9KVnRro}m#IP*2]ftyPU3A#.?~+MDE}|l~uPi5E&hzfgp02!lXnPJLfMyFOIrcq36s90Nz3RX~n?'}ZX
'Kl[k{#fBa4B\D-H`;c/~O,{DWrltYDbu
cB&H\hVaZIDYTP|JpTw0 |(ElJo{vC@#5#TnA4d@#{f)ux(EES'Ur]N!P[cp`8+Z-$vh%Hnk=K^%-[KQF'2NzTfjSgxG'/p HYMxgfOGx1"'SEQ1yY&)DC*|z{')=u`TS0u0{xp-(zi6zp3uZ'~E*ncrGPD,oW\m`2^ Hn0`h{G=zohi6H[d>^BJ~ W"c+JxhIu
[{d&s*LFh/?&r8>$x{CG4(72pwr*MRVQf.g"dZU\9f$
h*5%nV9[:60:23K Q`8:Cysg%8q?iX_`Q"'Oj
:OS^aTO.OO&O|c`p*%1TeV}"X*rHl=m!cD2D^)Xp$hj-N^pMb7x[Jck"P$Mp41NNv`5x4!k1Z/Y|ZH,k)W*Y(>f6sZRpYm
8Ph42K)}.%g%M]`1R^'qh/$3|]]y"zEh0xG(A]-I`MJGU7rKO~oi+K:4M(nyOXnvaWP4xV?d4Y^$8)2WOK,2s]gyny:-)@D*F%}ICT
Tu>ofc)P[DQ>Qn3=0^fuefIm1]-YHq5rx|W(S3:~2&*6!O|DAZWB:#n9|09`I`A3bq@\E\$=/L5VHm)[#|tI"lkuK.u|!2MT/@u7u(S{"H.H'Fh/4kF_2{)Jc9NQ%jA_rI1lH;k'$n~M_%t%y)t!C_4FO?idwMB]t^M::S!a=*Jee<3sgX@)L;zAuTN2}v#K4AX.(`X1<{#

Resources:

Tagged MySQL, TLS

Zhilevan Blog: Fix Drupal Files/Directories permissions by PHP after hacked

August 8, 2018 - 11:18
Last night a customer of our former company called me and needed help recovering their hacked website. First of all, I installed the Hacked module, checked the changed files and recovered them, then looked for and cleaned up backdoor files whose job was to inject external code (most of the time, JS files for traffic hijacking) into the website.
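
The post's title refers to fixing file and directory permissions with PHP; as a rough sketch of that idea (not the author's actual script), the loop below applies the commonly recommended Drupal permissions of 755 for directories and 644 for files. The docroot path is a placeholder, and you would want to handle sites/default/files and settings.php separately.

<?php
// Sketch only: reset permissions under a Drupal docroot after a compromise.
$docroot = '/var/www/html';

$items = new RecursiveIteratorIterator(
  new RecursiveDirectoryIterator($docroot, FilesystemIterator::SKIP_DOTS),
  RecursiveIteratorIterator::SELF_FIRST
);

foreach ($items as $item) {
  // 755 for directories, 644 for files.
  chmod($item->getPathname(), $item->isDir() ? 0755 : 0644);
}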

Drupal.org blog: What's new on Drupal.org? - July 2018

August 8, 2018 - 03:23

Read our Roadmap to understand how this work falls into priorities set by the Drupal Association with direction and collaboration from the Board and community.

Announcements

Git remote URL changes for full projects and sandboxes

Git authentication methods for Drupal.org hosted projects are changing as we approach upgrading our developer tooling stack.

In particular we are:

We have updated the version control instructions for Drupal.org projects, and put a message in our Git server for any user who makes a push using the deprecated format.

For more information, please review: https://drupal.org/gitauth

Reminder: Drupal Europe is coming up soon

Drupal Europe is coming up in less than 40 days! Drupal Europe will be the largest gathering of the Drupal community in Europe and is a reimagining of this important community event as both technical conference and family reunion. The Drupal Association engineering team will be attending to connect with the community, provide updates on Drupal.org, and listen to some of the incredible speakers who will be in attendance.

Join the community in Darmstadt, Germany, from September 10-14, 2018. Make sure to register, book your travel, and secure accommodation: http://drupaleurope.org/

We want your feedback on ideas for Drupal Core

The Drupal Association has proposed several initiatives for Drupal Core - but before they can be officially adopted they need feedback from stakeholders in the community (even if it's just a "+1") and to reach community RTBC. Here are the proposals:

Drupal.org Updates

Staff retreat

In July the Drupal Association gathered together in Portland, Oregon for our bi-annual staff retreat. At these retreats we discuss the progress made in the last six months, and our prioritization as an organization going into the next six-month period.

Hightech industry page launched

Drupal is the CMS of choice for a variety of companies in the high tech space, including organizations like Red Hat, Cisco, and Tesla. Whether it is used in a front-facing application, as a decoupled back-end, or for an internal intranet, experts in high tech turn to Drupal for their needs.

We launched a new industry page featuring these stories from high tech in July.

Drupal.org API updated for security advisories

To improve the automated toolchains built by organizations and individuals in the community to watch for new security advisories, we've updated the Security Advisory API. One of these changes ensures that the full canonical identifier for each advisory is included in the API data, which is a small but valuable change for anyone monitoring the API for advisory information.

Social Media Sharing for Events News

The DrupalCon news feed now includes social media sharing icons, so that you can better promote DrupalCon news and announcements to your networks. Word of mouth has always been a critical part of Drupal's success - so we hope that as featured speakers are announced, early bird registration begins, or the schedule is published, you will help us get the word out!

DrupalCon Seattle is coming up from April 8-12, 2019, and we're featuring some bold new changes to support a variety of audiences: from our traditional core of people who build Drupal, to marketers and content editors, and to the agency sales forces that sell Drupal to the world.

———

As always, we’d like to say thanks to all the volunteers who work with us, and to the Drupal Association Supporters, who make it possible for us to work on these projects. In particular we want to thank:

If you would like to support our work as an individual or an organization, consider becoming a member of the Drupal Association.

Follow us on Twitter for regular updates: @drupal_org, @drupal_infra

Platform.sh: How micro is your microservice?

August 7, 2018 - 23:02

"Microservices" have been all the rage for the past several years. They're the new way to make applications scalable, robust, and break down the old silos that kept different layers of an application at odds with each other.

But let's not pretend they don't have costs of their own. They do. And, in fact, they are frequently, perhaps most of the time, not the right choice. There are, however, other options besides one monolith to rule them all and microservice-all-the-things.

What is a microservice?

As usual, let's start with the canonical source of human knowledge, Wikipedia:

"There is no industry consensus yet regarding the properties of microservices, and an official definition is missing as well."

Well that was helpful.

Still, there are common attributes that tend to typify a microservice design:

  • Single-purpose components
  • Linked together over a non-shared medium (usually a network with HTTP or similar, but technically inter-process communication would qualify)
  • Maintained by separate teams
  • And released (or replaced) on their own, independent schedule

The separate teams part is often overlooked, but shouldn't be. The advantages of the microservice approach make it clear why:

  • Allows the use of different languages and tools for different services (PHP/MongoDB for one and Node/MySQL for another, for instance.)
  • Allows small, interdisciplinary teams to manage targeted components (that is, the team has one coder, one UI person, and one DB monkey rather than having a team of coders, a team of UI people, and a team of DB monkeys)
  • Allows different components to evolve and scale independently
  • Encourages strong separation of concerns

Most of those benefits tie closely to Conway's Law:

Any organization that designs a system (defined broadly) will produce a design whose structure is a copy of the organization's communication structure.

A microservice approach works best when you have discrete teams that can view each other as customers or vendors, despite being within the same organization. And if you're in an organization where that's the case then microservices are definitely an approach to consider.

However, as with any architecture there are tradeoffs. Microservices have cost:

  • Adding network services to your system introduces the network as a point of failure.
  • PointS of failure should always be plural, as a network, even a virtual and containerized one, has many, many points of failure.
  • The network will always be 10x slower than calling a function, even a virtual network. If you're using a shared-nothing framework like PHP you have to factor in the process startup cost of every microservice.
  • If you need to move some logic from one microservice to another it's 10x harder than from one library to another within an application.
  • You need to staff multiple interdisciplinary teams.
  • Teams need to coordinate carefully to avoid breaking any informal APIs
  • Coarse APIs
  • Needing new information from another team involves a much longer turnaround time than just accessing a database.

Or, more simply: Microservices add complexity. A lot of complexity. That means a lot more places where things can go wrong. A common refrain from microservice skeptics (with whom I agree) is

"if one of your microservices going down means the others don't work, you don't have a microservice; you have a distributed monolith."

To be sure, that doesn't mean you shouldn't use microservices. Sometimes that is the right approach to a problem. However, the scale at which that's the case is considerably higher than most people realize.

What's the alternative?

Fortunately, there are other options than the extremes of a single monolith and a large collection of separate applications that happen to talk to each other. There's no formal term for these yet, but I will refer to them as "clustered applications".

A clustered application:

  • Is maintained by a single interdisciplinary team
  • Is split into discrete components that run as their own processes, possibly in separate containers
  • Deploys as a single unit
  • May be in multiple languages but usually uses a single language
  • May share its datastore(s) between processes

This "in between" model has been with us for a very long time. The simplest example is also the oldest: cron tasks. Especially in the PHP world, many applications have had a separate cron process from their web request/response process for literally decades. The web process exists as, essentially, a monolith, but any tasks that can be pushed off to "later" get saved for later. The cron process, which could share some, all, or none of the same code, takes care of the "later". That could include sending emails, maintenance tasks, refreshing 3rd party data, and anything else that doesn't have to happen immediately upon a user request for the response to be generated.

Moving up a level from cron are queue workers. Again, the idea is to split off any tasks that do not absolutely need to be completed before a response can be generated and push them to "later". In the case of a queue worker "later" is generally sooner than with a cron job but that's not guaranteed. The workers could be part and parcel of the application, or they could be a stand-alone application in the same language, or they could be in an entirely different language. A PHP application with a Node.js worker is one common pattern, but it could really be any combination.

Another variant is to make an "Admin" area of a site a separate application from the front-end. It would still be working on the same database, but it's possible then to have two entirely separate user pools, two different sets of access control, two different caching configurations, etc. Often the admin could be built as just an API with a single-page-app frontend (since all users will be authenticated with a known set of browser characteristics and no need for SEO) while the public-facing application produces straight HTML for better performance, scalability, cacheability, accessibility, and SEO.

Similarly, one could make a website in Django but build a partner REST API in a separate application, possibly in Go to squeeze the last drop of performance out of your system.

There's an important commonality to all of these examples: Any given web request runs through exactly one of them at a time. That helps to avoid the main pitfall of microservices, which is adding network requests to every web request. The fewer internal IO calls you have the better; just ask anyone who's complained about an application making too many SQL queries per request. The boundaries where it's reasonable to "cut" an application into multiple clustered services are anywhere there is, or can be, an asynchronous boundary.

There is still additional complexity overhead beyond a traditional monolith: while an individual request only needs one working service and there's only one team to coordinate, there are still multiple services to manage. The communication paths between them are still points of failure, even if they're much more performance tolerant. There could also be an unpredictable delay between actions; an hourly cron could run 1 minute or 59 minutes after the web request that gave it an email to send. A queue could fill up with lots of traffic. Queues are not always perfectly reliable.

Still, that cost is lower than the overhead of full separate-team microservices while offering many (but not all) of the benefits in terms of separation of concerns and allowing different parts of the system to scale and evolve mostly independently. (You can always throw more worker processes at the queue even if you don't need more resources for web requests.) It's a model well worth considering before diving into microservices.

How do I do either of these on Platform.sh?

I'm so glad you asked! Platform.sh is quite capable of supporting both models. While our CPO might yell at me for this, I would say that if you want to do "microservices" you need multiple Platform.sh projects.

Each microservice is supposed to have its own team, its own datastore, its own release cycle, etc. Doing that in a single project, with a single Git repository, is rather counter to that design. If your system is to be built with 4 microservices, then that's 4 Platform.sh projects; however, bear in mind that's a logical separation. Since they're all on Platform.sh and presumably in the same region, they're still physically located in the same data center. The latency between them shouldn't be noticeably different than if they were in the same project.

Clustered applications, though, are where Platform.sh especially shines. Every project can have multiple applications in a single project/Git repository, either in the same language or in different languages. They can share the same data store or not.

To use the same codebase for both the web front-end and a background worker (which is very common), we support the ability to spin up the same built application image as a separate worker container. Each container is the same codebase but can have different disk configuration, different environment variables, and start a different process. However, because they all run the same code base it's only a single code base to maintain, a single set of unit tests to write, etc.

And of course cron tasks are available on every app container for all the things cron tasks are good for.

Within a clustered application, processes will usually communicate either by sharing a database (be it MariaDB, PostgreSQL, or MongoDB) or through a queue server, for which we offer RabbitMQ.
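
As a rough illustration of the queue-worker half of that pattern, here is a minimal consumer sketch using the php-amqplib library; the host, credentials, queue name, and the work done per message are all placeholders.

<?php
// Minimal RabbitMQ worker sketch (php-amqplib). Connection details are placeholders.
require __DIR__ . '/vendor/autoload.php';

use PhpAmqpLib\Connection\AMQPStreamConnection;
use PhpAmqpLib\Message\AMQPMessage;

$connection = new AMQPStreamConnection('rabbitmq.internal', 5672, 'guest', 'guest');
$channel = $connection->channel();

// Durable queue so pending jobs survive a broker restart.
$channel->queue_declare('deferred_tasks', false, true, false, false);

$callback = function (AMQPMessage $msg) {
  $job = json_decode($msg->body, true);
  // Do the deferred work here: send the email, refresh 3rd-party data, etc.
  $msg->delivery_info['channel']->basic_ack($msg->delivery_info['delivery_tag']);
};

$channel->basic_consume('deferred_tasks', '', false, false, false, false, $callback);

while (count($channel->callbacks)) {
  $channel->wait();
}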

Mixing and matching is also entirely possible. In a past life (in the bad old days before Platform.sh existed) I built a customer site that consisted of an admin curation tool built in Drupal 7 that pulled data in from a 3rd party, allowed users to process it, and then exported pre-formatted JSON to Elasticsearch. That exporting was done via a cron job, however, to avoid blocking the UI. A Silex application then served a read-only API off of the data in Elasticsearch, far faster than a Drupal request could possibly have done.

Were I building that system today it would make a perfect case for a multi-app Platform.sh project: A Drupal app container, a MySQL service, an Elasticsearch service, and a Silex app container.

Please code responsibly

There are always tradeoffs in different software design decisions. Sometimes the extra management, performance, and complexity overhead of microservices is worth it. Sometimes it's... not, and a tried-and-true monolith is the most effective solution.

Or maybe there's an in-between that will get you a better balance between complexity, performance, and scalability. Sometimes all you need is "just" a clustered application.

Pick the approach that fits your needs best, not the one that fits the marketing zeitgeist best. Don't worry, we can handle all of them.

Larry Garfield 7 Aug, 2018

Community: Governance Task Force Community Update, August 2018

August 7, 2018 - 22:40

This is a public update on the work of the Governance Task Force.

We have progressed into what we are calling the “Engagement Phase” of our schedule: interviewing community members and working groups, and soliciting feedback at meetups and camps. To date we have interviewed at least 18 people (including community members, liaisons, and leadership) and 3 groups, with at least 15 more being scheduled.

Interviews

If you would like to participate in an interview, please contact any member of the Governance Task Force or sign up using this Google form.

The purpose of interviews is to meet with people individually to get feedback and ideas, and have a conversation about community governance (non-technical governance). Many governance related discussions have occurred in the past, but we want to make sure everyone has an opportunity to be heard, since group discussions are not always conducive to individual thoughts. Notes taken during the interview are available to, and editable by, the person interviewed, and are not shared outside the Governance Task Force. If you have any concerns about a language barrier, privacy, or anything else related to participating in an interview, contact us. We will do what we can to work with you.

Analysis

The individual interviews are a new step in this governance process, but we do have access to a lot of information that was already compiled from prior discussions. Many town hall style discussions were held over the past year, and we are using all of that information. As we progress into the “Analysis Phase” we are using that information to develop user stories and ideas that will help inform our eventual proposal. Once the interviews are concluded, their analysis will be merged with the existing information.

Drupal Europe

Rachel, Ela, and Stella will be providing an update on the task force’s efforts at Drupal Europe. Findings will be shared and there will be an open discussion to hear from attendees to inform our efforts.

Ongoing Feedback

The task force is committed to working transparently and delivering a well-rounded proposal for both the community and for leadership to review. We believe the proposal presents a great opportunity to help evolve community governance and inform next steps. Should you want to contact the Governance Task Force, feel free to reach out to any member of the group via Slack, drupal.org, or any public place you find our members.

We’ve also set up a Google form for direct feedback. If you do not want to participate in an interview, but do want to contribute your individual thoughts, use this form. You can answer as many or as few questions as you like. You can also submit the form anonymously. This form will stay active throughout the proposal process, so if you have thoughts to share at a later date, you can still use this form.

Adam Bergstein
David Hernandez
Ela Meier
Hussain Abbas
Lyndsey Jackson
Rachel Lawson
Stella Power

Ixis.co.uk - Thoughts: Last month in Drupal - July 2018

August 7, 2018 - 22:22
July has been and gone, so here we take a look back at all the best bits of news that have hit the Drupal community over the last month.

Drupal Development

Dries Buytaert discussed why more and more large corporations are beginning to contribute to Drupal. He shares an extended interview with Pfizer Director Mike Lamb, who explains why his development team there has ingrained open source contribution into the way they work. Drupal 8.5.5 was released in July; this patch release for Drupal 8 contained a number of bug fixes, along with documentation and testing improvements. It was announced that Drupal 8.6.0 will be released on September 5th; the alpha version was released the week beginning July 16th, and the beta followed the week of July 29th. This release will bring with it a number of new features, and Drupal released a roadmap of all the fixes and features they aim to have ready for the new release.

Events

Drupal Europe announced 162 hours of sessions and 9 workshops for the event on Tuesday, Wednesday and Thursday. They also urge anyone with ideas for social events at this year's event to submit them to help fill out the social calendar with community-led ideas. On August 17-19, New York will play host to the second Decoupled Drupal Days. For those that don't know, Decoupled Drupal Days gathers technologists, marketers and content professionals who build and use Drupal as a content service -- for decoupled front ends, content APIs, IoT, and more. DrupalCamp Colorado recently took place. The event proved popular as usual, and this year's keynote, “The Do-ocracy Dilemma and Compassionate Contribution”, was delivered by Acquia Director of Research and Innovation, Preston So. Preston discusses why a more compassionate approach to contribution is so critical when it comes to managing open-source projects, crafting conference lineups, enabling a successful team, and building a winning business.

New Modules

New modules, updates and projects were of course released throughout July; the pick of the bunch includes:

  • Commerce 8.x-2.8 - the e-commerce suite sees a number of bug fixes
  • google_analytics 8.x-2.3 - the module sees a number of bug fixes
  • Drupal 8.5.5 - patch release with a number of bug fixes and testing improvements

That is the end of this month's round-up. Keep an eye out for next month's, where we cover all the latest developments in the Drupal community and all the important news affecting the wider Drupal community. Missed last month's round-up? Check it out on the Ixis site now.

Amazee Labs: Transparent Database Sanitization with GDPR-dump

August 7, 2018 - 19:12

With GDPR in full effect, sanitization of user data is a fairly hot topic. Here at Amazee we take our clients’ and our clients’ clients’ privacy seriously, so we have been investigating several possible approaches to anonymizing data.

In the Drupal world, and the PHP world more generally, there are several options available. Here, though, I’d like to discuss one we think is particularly cool.

by Blaize Kaye

At Amazee Labs’ Global Maintenance, we work with several different projects per day. We move data from our production to staging and dev servers, and from our servers to our local development environments. Especially on legacy systems, site-specific configuration details often exist only in the databases, and even if that weren’t the case, the issues we’re investigating routinely require that we dig into the database as it (more or less) is on the production servers. Anonymization is crucial for our day to day work.

So our consideration here is: how do we balance productivity while keeping things anonymous?

One way of achieving this is to make anonymization transparent to the developer. Essentially, we want our developers to be able to pull down the live database as it exists at the moment they pull it down, and have it be anonymized.

How can we achieve this?

Well, one way is to analyse the daily workflow to see if there are any points the data has to flow through before it reaches the developer.

It turns out that, if you’re working with MySQL, this “final common path” that the data flows through is the mysqldump utility.

If you’re running backups, chances are you’re using mysqldump.

If you’re doing a drush sql-sync there’s a call to mysqldump right at the heart of that process.

Mysqldump is everywhere.

The question is, though, how do we anonymize data using mysqldump?

The standard mysqldump binary doesn’t support anonymization of data, and short of writing some kind of plugin, this is a non-starter.

Fortunately for us, Axel Rutz came up with an elegant solution: a drop-in replacement for the mysqldump binary, which he called gdpr-dump. A few of us here at Amazee loved what he was doing, and started chipping in.

The central idea is to replace the standard mysqldump with gdpr-dump so that any time the former is called, the latter is called instead.

Once the mysqldump call has been hijacked, so to speak, the first order of business is to make sure that we are actually able to dump the database as expected.

This is where mysqldump-php comes in. It’s the library on which the entire gdpr-dump project is based. It provides a pure PHP implementation of mysqldump as a set of classes. On its own, it simply dumps the database, just as the native mysqldump cli tool does.

A great starting point, but it only gets us part of the way.

What we’ve added is the ability to describe which tables and columns in the database being dumped you would like to anonymize. If, for instance, you have a table of user data with names, emails, telephone numbers, etc., you can describe the structure of this table to gdpr-dump and it will generate fake, but realistic-looking, data using the Faker library.

This requires some upfront work, mapping the tables and columns, but once it is done you’re able to call mysqldump in virtually any context, and it will produce an anonymized version of your database.
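
The mapping format itself is gdpr-dump's own, so check the project documentation for the real syntax. Purely to illustrate the idea, the sketch below maps hypothetical table columns to Faker formatters and generates the kind of replacement values an anonymizing dump would substitute for the real data.

<?php
// Illustration only: the table and column names here are made up.
require __DIR__ . '/vendor/autoload.php';

$faker = Faker\Factory::create();

// Which columns should be replaced, and with which Faker formatter.
$map = [
  'users' => [
    'name'  => 'userName',
    'mail'  => 'safeEmail',
    'phone' => 'phoneNumber',
  ],
];

// In a real dump, each row's value for a mapped column is swapped for fake data.
foreach ($map['users'] as $column => $formatter) {
  printf("%s => %s\n", $column, $faker->format($formatter));
}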

There is still a lot of thinking and work to be done, but we think it’s worth investing time in this approach. The fact that it can be used transparently is its most compelling aspect - being able to simply swap out mysqldump with gdpr-dump and have the anonymization work without having to change any of the dependent processes.

If any of this piques your interest and you’re looking for more details about how you might be able to use gdpr-dump in your own workflow, feel free to check out the project (and submit PRs): https://github.com/machbarmacher/gdpr-dump.

ADCI Solutions: Visual regression testing with BackstopJS

August 7, 2018 - 17:26

The larger a project, the more time you will spend on regression testing after each change. But there are a lot of tools which can help you reduce the effort this process takes. One of them is BackstopJS.

Get acquainted with BackstopJS

OSTraining: The Ultimate Tutorial for Drupal's Paragraphs Module

August 7, 2018 - 00:25

Over the last few months we've worked with more and more Drupal 8 sites. Those projects all had one thing in common ... they used the Drupal Paragraphs module.

Paragraphs is a very popular module for handling content in Drupal 8.

Paragraphs works in a similar way to content fields, but also provides a wide range of options for the design, layout and grouping of your content. 

Jacob Rockowitz: Caring about webform accessibility

August 6, 2018 - 21:00

It is easy not to care about accessibility, mainly because we generally don't see or understand how people with disabilities use our applications. Frankly, even usability testing can become an afterthought when it comes to building websites. There are lots of moving parts to a website or an application, and it is hard to pause and ask whether someone can access this information using just their keyboard and/or a screen reader. The more accessible your website is, the more users you can reach and engage with your website's mission or purpose.

At Design4Drupal in Boston, caring about accessibility became the central theme for my presentation, titled 'Webform Accessibility'. After I gave my presentation, I created Issue #2979628: [meta] Improve Webform Accessibility and started fixing some obvious and not-so-obvious issues with the Webform module for Drupal 8. Andrew Macpherson, one of the Drupal Accessibility Topic maintainers, was kind enough to spend an entire train ride from NYC to Boston discussing (via Drupal Slack) form related accessibility issues and how to resolve them.

There are tools that can show you obvious problems

The most common form accessibility issue I see across the web is a failure to provide descriptive labels for form inputs. Labeling form inputs makes it possible for a screen reader to describe what input value is expected, as well as determine how it’s going to be used. For example, a screen reader needs to be able to identify a website's search box so that users can find content quickly. The solution is to provide a hidden label or title attribute to a site's search...
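
In Drupal 8's Form API the fix is cheap: give the element a #title and, if the design has no room for a visible label, render it invisibly so screen readers still announce it. A minimal sketch (the element name and placeholder text are made up, and $this->t() assumes a form class):

// Search box whose label is hidden visually but still exposed to screen readers.
$form['keywords'] = [
  '#type' => 'search',
  '#title' => $this->t('Search'),
  '#title_display' => 'invisible',
  '#attributes' => ['placeholder' => $this->t('Search this site')],
];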

I Fix Drupal: Debugging "Relay log read failure" With MySQL Master-Slave Replication For Drupal

August 6, 2018 - 16:28
Our Drupal 7 installation has served us well using a single, optimised MySQL database server. However, a desire to deliver advanced reporting and dashboards driven by Power BI required us to implement a replicated slave so that Power BI could draw data from a datasource that would not impact the performance of the database serving the website. Mostly the configuration was straightforward:

  • Replicate a MySQL 5.5 master into a single MySQL 5.7 slave
  • Stitchdata tunnels into the MySQL slave to access binary logs - Stitchdata requires at least MySQL 5.6, which explains the version difference you...
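
The excerpt cuts off there, but as a rough sketch of the kind of health check involved when debugging this error, the snippet below queries the slave's replication status from PHP; the DSN and credentials are placeholders, and "Relay log read failure" typically shows up in the Last_Error / Last_SQL_Error columns.

<?php
// Sketch: inspect replication health on the slave via PDO.
$pdo = new PDO('mysql:host=slave.example.com;port=3306', 'monitor', 'secret');
$status = $pdo->query('SHOW SLAVE STATUS')->fetch(PDO::FETCH_ASSOC);

// Both threads should report "Yes"; Last_Error holds the relay log failure message.
foreach (['Slave_IO_Running', 'Slave_SQL_Running', 'Seconds_Behind_Master', 'Last_Error'] as $key) {
  printf("%s: %s\n", $key, $status[$key] ?? '');
}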

Jason Pamental's RWT.io Blog: Drupal 8 Theming Quickstart

August 4, 2018 - 21:46

This is a full-day workshop diving into the structure of a new theme in Drupal 8. We’ll learn about the structure, how to create a theme, add/remove/replace CSS and JS libraries, include external resources like web fonts, and learn the basics of Twig and the new templating system.

Download the workshop slides here.

 


Mediacurrent: The Marketer’s Guide to Drupal 8: Why Open Source is the Right Fit for your Organization

August 4, 2018 - 02:26

We have all heard the debate about Open Source Software versus Closed or Proprietary; but what is the real difference?

Simply: 

Open source software is available for the general public to use and modify from its original design free of charge. 

versus

Closed source where the source code is not shared with the public for anyone to look at or change. 

One of the main advantages of open source software is the cost; however, when applied to OSS, the term "free" has less to do with overall cost and more to do with freedom from restrictions. 

According to Forrester Research, 90% of the code in a typical application is Open Source.

For a Closed Source CMS, depending on the choice of software, the cost can vary between a few thousand to a few hundred thousand dollars, which includes a base fee for software, integration and services, and annual licensing/support fees. 

In a 2017 report by Black Duck Software (now part of Synopsys), nearly 60% of respondents said their organizations’ use of open source increased in the last year, citing:

  • cost savings, easy access, and no vendor lock-in (84%)
  • ability to customize code and fix defects directly (67%)
  • better features and technical capabilities (55%)
  • the rate of open source evolution and innovation (55%).

 

1M+ websites across every industry vertical —from Georgia.gov to Harvard— trust Drupal as a secure open source CMS platform.

Open Source Software has long-term viability and is always on the cutting edge of technology. Selecting technologies means committing to solutions that will support an active, growing business over the long term, so it requires careful consideration and foresight. Here are some of the benefits of open source to consider:

1. Value for cost 
       Worried about your marketing budget when looking at changing your CMS? Open source software has no licensing fees! It’s free, which means room to spend your $ on other initiatives.
2. Added Security
       Open source means community. The more people (developers) looking at the source code, the more fixes and regular updates will be available. What can sometimes take weeks or months to resolve with proprietary software takes just hours or days with open source. This will help give your marketing team peace of mind knowing that if you don’t have time to look at the code of your site - or you don’t know how - then there are developers all over the world continuously checking for bugs & fixes.
3. Customizability
       Have a really customized idea for your site that you’ve never seen elsewhere? Open Source can help. By customizing the code to fit your needs, it provides a competitive advantage for your business.
4. Flexibility
       Open-source technology naturally provides flexibility to solve business problems. Your team and organization can be more innovative, and it gives you the ability to stay on the cutting edge of latest trends & designs.
5. Integrations
       With open source, especially Drupal, you can integrate best-of-breed marketing technology. It is architected for easy integration with your tools - marketing automation, email service providers, CRM, etc. Drupal 8 gives you the foundation for your digital experience ecosystem.
6. Speed
       This isn’t just about site speed, but the ability to get your site up and running - with full marketing capabilities - on time & within budget. Open source allows you to deliver value right away.
7. Scalability 
       Drupal and other open source platforms give you the advantage of being able to scale your digital presence for the future. You're not confined to stick with what you already have. You can continue to evolve and build long-term with open source.

The benefits of open source could go on for pages, but it’s important when evaluating your options to think about your business and its goals. One consistent need we see is having access to a CMS that is easy for you and your team to manage on a day-to-day basis.

In the next blog of the series - we’ll hear from the Associate Director of Digital Marketing at Shirley Ryan Abilitylab, about how he is leveraging open source - particularly Drupal - to achieve his business goals. 

Ashday's Digital Ecosystem and Development Tips: Using the Google Natural Language API with Drupal

August 4, 2018 - 02:00

If you haven’t heard of the phrase “Natural Language Processing” by now, you soon will. Natural Language processing is an expanding and innovative use of technology to analyze large amounts of data or content and derive meaning from it, short-cutting a tremendous amount of manual effort needed to do that ourselves. It’s been around in some form for quite a while, but it was often relegated to complex enterprise systems or large corporations with a vested interest in automating the data mining of huge amounts of data to figure out what the patterns were, for example, in consumer purchasing trends or social media behavior. It’s a cool idea (it’s a form of artificial intelligence after all) and fuels a lot of our online experience now whether it’s product recommendations, content recommendations, targeted ads, or interactive listening services like Siri or Alexa. What’s even better is that this sort of thing is becoming more and more accessible to use in our own software solutions as many of these now provide services with APIs. This allows us to provide a more personalized or meaningful experience for site visitors on web projects that likely don’t have the budget or requirements to justify attacking natural language processing itself and can instead find accessible ways to benefit from the technology.
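
To make that a little more concrete, here is a rough sketch of what calling the service can look like from PHP, assuming the google/cloud-language client library is installed and application credentials are already configured; the sample text and output handling are placeholders.

<?php
// Sketch: entity and sentiment analysis with the google/cloud-language client.
require __DIR__ . '/vendor/autoload.php';

use Google\Cloud\Language\LanguageClient;

$language = new LanguageClient();
$text = 'Drupal is a free and open-source content management framework.';

// Overall sentiment of the text (score runs roughly from -1.0 negative to 1.0 positive).
$sentiment = $language->analyzeSentiment($text)->sentiment();
printf("Sentiment score: %s\n", $sentiment['score']);

// Entities the service recognised, with a salience (importance) value for each.
foreach ($language->analyzeEntities($text)->entities() as $entity) {
  printf("%s (%s), salience %s\n", $entity['name'], $entity['type'], $entity['salience']);
}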

OpenSense Labs: Top 2018 Drupal Modules using Artificial Intelligence

August 3, 2018 - 15:20
by Akshita

Considering the spectacular potential AI has, it is no wonder it is one of the most sought-after trends in business today. You might already be employing it in your online business, in the form of chatbots offering smarter features to users.


Want to employ artificial intelligence on your Drupal website but don't know where to start? In this blog, we will share some cool Drupal modules for you to innovate and be part of the current trends.

Top Drupal Modules Using Artificial Intelligence
  1. Intelligent Content Tools

    The Intelligent Content Tools module is developed by OpenSense Labs with the goal of merging Natural Language Processing with Drupal 8 to help with:
     
    • Auto-tagging: The Auto-Tag module analyzes the content of the field specified and then tags the content by extracting text from that field. It will automatically tag articles with tags generated from the content, so it’ll be a lot easier for users to look for content with the same tags. 
       
    • Text Summarization: It summarizes the text in the field. For a media website, it is important that the teaser shows just the right information; the text summarizer helps write that summary, no matter how long the content is.
    • Duplicate content: While the number of plagiarism checkers available in the market is high, the Duplicity Rate module tells you if there is any duplicate content on the website. It is an intelligent agent module based on Natural Language Processing, and a helpful tool for website designers and content editors. 

      Available for - Drupal 8 | Not covered by Security Advisory

      Read more about How NLP & Drupal Can Provide The Best User Experience
       
  2. Cloudwords for Multilingual Drupal

    No matter how popular your enterprise is, nothing can beat the essence of familiarity that native tongue/language brings. Built for marketers, by marketers, Cloudwords for Multilingual Drupal gives you the power to deliver global campaigns and localized assets with speed and scale. This is the marketing opportunity that no e-commerce business should miss out on.

    Cloudwords can onboard any language provider in a matter of minutes and help you deliver globally consistent, high-quality and timely content to multiple countries in many languages. Its smart AI allows workflow automation and powerful project management capabilities to select the content you want to localize, and Cloudwords does the rest. 

    Its CAT tool has a smart internal translator which leverages artificial intelligence and machine learning to significantly increase productivity.

    It can automatically extract all the content and create a project request in Cloudwords. All content is automatically represented in the translation industry’s XLIFF format, making it simple to process and interoperable with a wide variety of tools.

    Available for - Drupal 8 | Stable Release

  3. Azure Cognitive Services API

    The Azure Cognitive Services API module can seamlessly integrate these intelligent features and, with the use of Machine Learning, Artificial Intelligence, and Natural Language Processing, can handle speech, facial and vision recognition, as well as sentiment detection. 

    It offers four different modules for each feature:

    • Face API Module: It integrates with Microsoft Face API, a cloud-based service that provides face detection with attributes and face recognition. It can: 

      1. Detect human faces and compare similar ones
      2. Organize images into groups based on similarity
      3. Identify previously tagged people in images
    • Emotion Recognition API Module: Although in beta, it takes an image as input and returns the confidence across a set of emotions for each face in the image, as well as a bounding box for the face, from the Face API. 
       
    • Computer Vision API Module: It extracts information from images to categorize and process visual data, and provides machine-assisted moderation of images to help curate your services.
       
    • Azure Text Analytics API Module: Text Analytics API is a cloud-based service that provides advanced natural language processing over raw text, and includes three main functions: sentiment analysis, key phrase extraction, and language detection.

      Available for - Drupal 8 | Stable Release
  4. Acquia Lift Connector

    Looking for a module that can help you merge content and customer data into one tool? Acquia Lift helps you deliver the most cohesive and personalized experiences across multiple channels and devices.

    With the ability to target audiences in real-time, marketers are able to scale their web personalization efforts in order to drive conversions and bottom-line results. This module provides integration with the Acquia Lift Service and an enhanced User Experience for Personalization, Testing and Targeting directly on the front-end of your website.

    Features

    • Drag-and-drop UI for content targeting

    • Unified customer profile

    • Merging anonymous and known visitor profile

    • Content distribution

    • Real-time, adaptive segmentation

    • Behavioral targeting and A/B testing

      Available for - Drupal 8 | Stable Release

  5. Quora - Related Questions / Posts

    One of the old school marketing techniques involves promoting content through various open/public channels. The Quora module helps provide related questions/posts from Quora on your Drupal website in a block.

    The smart intelligence uses tags (provided by the fields of one of the content types) to fetch relevant Quora questions. The field acting as the interconnection between your Drupal website and Quora is easily configurable.

    This module uses:

    • Google's Custom Search Engine (CSE) API - to fetch Quora questions/posts 

      Available for - Drupal 8 | Version not stable 
      Drupal 7 | Stable Release

  6. Automatic Alternative Text

    The basic principle at work here is easy perceivability: any and all information should be presented in a way that is easily perceivable by the user. Non-text information like images and video needs a text alternative that describes the content, so that screen readers can read it. 

    The Automatic Alt Text module automatically generates alternative text for images when no alt text has been provided by the user. This module works great for websites and portals with user-generated content, where users may not even be aware of the purpose and importance of alternative text. 

    It describes the content of the image in one sentence but it doesn’t provide face recognition. 

    Available for - Drupal 8 | Stable Release

    Read how Drupal is Ensuring the Web Accessibility Standards

  7. Chatbot API

    In the era of personal assistants like Alexa and Google Home, chatbots are the cool fad for your website. The Chatbot API module creates a common layer serving Drupal content to any of the personal assistant services on the market.

    However, the module depends on other modules like Drupal Alexa; Chatbot API by itself doesn't do anything. You should install it only if another module asks for it or you want to build your own integration driver.

    Available for - Drupal 8 | Not covered by Security Advisory 

  8. OpenCalais

    An integration of Thomson Reuters' Calais web service with Drupal, OpenCalais creates rich semantic metadata for the content you submit – in well under a second. Using NLP, Calais analyzes the document and finds the entities within it. The metadata returned can be automatically assigned to vocabularies, or it can merely suggest terms, allowing full user control of the tagging.

    Going beyond classic entity identification, Calais returns the facts and events hidden within your text as well. 

    Available for - Drupal 7 | Stable Release

  9. SendPulse

    The SendPulse module provides integration with SendPulse, an integrated platform for web and push notifications. It assists greatly when revamping your marketing and focusing on email and SMS, and it claims to help achieve 60% open rates with the use of its artificial intelligence. 

    It also relies heavily on personalization in order to achieve those numbers. The AI monitors user behavior such as email opening hours, communication preferences, and time zone, among other things. Once the information has been collected, its AI predicts the medium and hour for the best results. 

    Features:

    • SendPulse Mailing Lists

    • Pulling and updating mailing lists from SendPulse using cron

    • Mass export users from Drupal to SendPulse

    • Map Drupal user fields with SendPulse mailing lists variables

    • Creating new SendPulse Mailing Lists from Drupal admin interface

      Push Notifications

    • Auto-creating new Push Notification and sending to subscribers after creating a new node

    • You can control which node types should create push notifications when a new node is created

    • Form for sending a single Push notification.

      Available for - Version 7 | Stable Release

Artificial Intelligence and Machine Learning have a lot to offer and the potential to improve user interactivity on your website. The use cases of AI and ML range from personalization to innovation to communication.

Looking for a smart solution for your business? Contact us at hello@opensenselabs.com


OSTraining: The Animation Module in Drupal 8

August 3, 2018 - 11:16

There are multiple JavaScript and CSS libraries on the internet. They allow you to animate certain parts of your site and make them look more attractive.

The Animations module in Drupal makes use of three useful libraries:

  • animate.css
  • typed.js
  • WOW.js

They give the elements inside your site some extra cool features and make it more appealing.

In this tutorial, you will learn how to install the required libraries and the module and take a look at its basic usage.

Let’s start!

Jeff Geerling's Blog: NEDCamp 2018 - Keynote on DevOps

August 3, 2018 - 05:34

Over the past decade, I've enjoyed presenting sessions at many DrupalCamps, DrupalCon, and other tech conferences. The conferences are some of the highlights of my year (at least discounting all the family things I do!), and lately I've been appreciative of the local communities I meet and get to be a part of (even if for a very short time) at Drupal Camps.

The St. Louis Drupal Users Group has chosen to put off its annual camp to 2019, so we're guiding people to DrupalCorn Camp, which is only a little bit north of us, in Iowa.

Tandem's Drupal Blog: Lando + Envoy

August 2, 2018 - 23:34
Learn to automate deploy steps with Envoy on affordable hosting.

Why Envoy

Envoy is a task runner put together by the Laravel team. When I can, I like to use hosts like Pantheon and Platform.sh that give me a wealth of tools, containers in production, and a lot fewer DevOps headaches when something like Heartbleed happens 😱...