Channel: Pentaho Community Forums

Accent Problem - Pentaho User console 6

The User Console, reports, and everything else don't encode European accented characters correctly.

I put utf8 everywhere but it doesn't change anything.

I followed this guide to solve the problem: http://pedroalves-bi.blogspot.it/201...acters-in.html

but after I added those strings, Pentaho wouldn't start anymore (so I had to revert the modifications).

Any help? Thx


Pentaho is running on CentOS 7. In CentOS the locale is set to LANG=en_US.UTF-8.

Inserted value not seen with database lookup in same transformation

Hi guys,
I have three steps occurring in the following order. The main purpose is to maintain an accumulated (running total) table:
1. look up the accumulated table for the value summed so far
2. a calculator step which does the following sum: (previously looked-up value from the accumulated table + new value read from the previous steps)
3. an update/insert step to write the new summed value to the accumulated table.

However, after the transformation finishes, the accumulated value is just the last value read. For example, the values 1100, 2000 and 3000 should be summed and written to the accumulated table, but the result is 1100. When I look at the logs I see "No result found after database lookup! (add defaults)" for all the values looked up, so the default value 0 is returned. Why do you think I am getting the defaults? I have caching disabled for the lookup step and a commit interval of 1 for the insert/update step.
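
For reference, the intent of the three steps is equivalent to this minimal JDBC sketch (a hedged illustration only; the table and column names are placeholders, not the actual schema). If the lookup always falls back to the default 0, each write simply stores the latest value instead of the running total:

Code:

import java.sql.*;

public class Accumulate {
    // Adds newValue to the running total stored for a key (placeholder schema).
    static void accumulate(Connection con, int key, long newValue) throws SQLException {
        long current = 0;
        // Step 1: look up the value summed so far
        try (PreparedStatement lookup = con.prepareStatement(
                "SELECT total FROM accumulated WHERE id = ?")) {
            lookup.setInt(1, key);
            try (ResultSet rs = lookup.executeQuery()) {
                if (rs.next()) current = rs.getLong(1);
            }
        }
        // Step 2: calculator
        long sum = current + newValue;
        // Step 3: insert/update the new total
        try (PreparedStatement upd = con.prepareStatement(
                "UPDATE accumulated SET total = ? WHERE id = ?")) {
            upd.setLong(1, sum);
            upd.setInt(2, key);
            if (upd.executeUpdate() == 0) {
                try (PreparedStatement ins = con.prepareStatement(
                        "INSERT INTO accumulated (id, total) VALUES (?, ?)")) {
                    ins.setInt(1, key);
                    ins.setLong(2, sum);
                    ins.executeUpdate();
                }
            }
        }
    }
}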


Thanks in advance.

How to connect the Mondrian server purely on client side (XML/A access by Ajax)?

I found the Mondrian documentation for XML/A access and setup, but I can't find detailed information on accessing XML/A directly from the Mondrian server. So how can I access XML/A on the Mondrian server using an Ajax call, the way I do with SSAS? Is there any connection parameter to connect to the Mondrian BI server through Ajax, as below?

I am accessing XML/A with an Ajax call from JavaScript by going through msmdpump.dll on SSAS (SQL Server Analysis Services); my JavaScript code for accessing XML/A from SSAS via Ajax is:

Code:



var MDX = " SELECT  {[Date].[Fiscal]  ON COLUMNS ,  {[Measures].[Customer Count]}  ON ROWS  FROM [Adventure Works] ;           
$.ajax({
                type: "POST",
                url: "http://localhost:6078/olap/msmdpump.dll",
                data: "<Envelope xmlns=\"http://schemas.xmlsoap.org/soap/envelope/\"> <Header></Header> <Body> <Execute xmlns=\"urn:schemas-microsoft-com:xml-analysis\"> <Command> <Statement> " + MDX +" </Statement> </Command> <Properties> <PropertyList> <Catalog>Adventure Works DW Standard Edition</Catalog> </PropertyList> </Properties> </Execute> </Body> </Envelope>",
                success: function (responce, textStatus, jqXHR) {
                                                                $('body').append(responce);
                },
                contentType: "text/xml",
                dataType: "xml",
            });
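
For what it's worth, the Pentaho BA Server exposes Mondrian over XML/A through the /pentaho/Xmla servlet, so the same kind of SOAP POST should be able to target that URL instead of msmdpump.dll. A minimal server-side sketch to verify the endpoint, using the olap4j XMLA driver (the host, credentials, catalog and cube names are placeholders, not taken from an actual setup):

Code:

import java.sql.Connection;
import java.sql.DriverManager;
import org.olap4j.CellSet;
import org.olap4j.OlapConnection;

public class XmlaCheck {
    public static void main(String[] args) throws Exception {
        // olap4j XML/A driver speaking SOAP to the BA Server's Xmla servlet
        Class.forName("org.olap4j.driver.xmla.XmlaOlap4jDriver");
        Connection con = DriverManager.getConnection(
                "jdbc:xmla:Server=http://localhost:8080/pentaho/Xmla;Catalog=SteelWheels",
                "admin", "password");
        OlapConnection olap = con.unwrap(OlapConnection.class);
        CellSet result = olap.createStatement().executeOlapQuery(
                "SELECT {[Measures].[Sales]} ON COLUMNS FROM [SteelWheelsSales]");
        System.out.println(result.getAxes().size() + " axes returned");
        con.close();
    }
}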

Text File too big in export to text

Hello, I have an issue with one of my transformations. I have to produce a text output from an XML. When I run my transformation (it has just a TABLE INPUT and a TEXT OUTPUT) it gives me a text file of 9.53 MB, but when I copy and paste the XML from the database (using the same query) my text file comes to only 20 KB.
I don't know what to do to fix this...
The image of my transformation:

form.png

Community Startup Tabs doesn't work

I am trying to configure CST; however, it keeps opening the BI Server HOME page after I log into the application.


This is what my configuration file looks like:

Code:

<?xml version="1.0"?>
<cstConfig>


  <rule match="USER" pattern="false" value="fatourbanismo" mode="launcher">
    <tab title="Dashboard FATO Urbanismo" order="1" fullScreen="false" tooltip="Dashboard FATO Urbanismo">/pentaho/api/repos/%3Ahome%3Afatourbanismo%3AFATO_URBANISMO%3AStart%3Astart.wcdf/generatedContent</tab>
  </rule>


  <rule match="ROLE" pattern="false" value="Power FATOURBANISMO" mode="launcher">
    <tab title="Dashboard FATO Urbanismo" order="1" fullScreen="false" tooltip="Dashboard FATO Urbanismo">/pentaho/api/repos/%3Apublic%3AFATO_URBANISMO%3AStart%3Astart.wcdf/generatedContent</tab>
  </rule>
 
</cstConfig>


I am using the BA Server 7.0.0.0.25. I installed CST through Marketplace and it is located in the folder: pentaho-solutions/system/cst. The config file is in pentaho-solutions/system/cst/resources/config.xml.

Best regards.

Excel Import POI OLE2 error: Invalid header signature; read 0x74, expected 0xE1

Hi,

I'm attempting to read a folder full of xlsm files. However, 2 of the 21 XLSM files I'm trying to read produce messages like this:

Invalid header signature; read 0x74776572646E610E, expected 0xE11AB1A1E011CFD0 - Your file appears not to be a valid OLE2 document

Clearly there is something not right about the file headers. I have tried re-saving the files, and saving as xlsb and re-saving as xlsm, without success. Any thoughts as to what I can do to correct this behavior? I could consider stripping out the VBA content and saving these as xlsx files, but I feel like that shouldn't be necessary.
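
For what it's worth, a quick way to see what those two files actually begin with is to dump their first bytes: OLE2 containers start with D0 CF 11 E0 A1 B1 1A E1, while OOXML files such as xlsm/xlsx/xlsb are ZIP archives and start with 50 4B ("PK"). A minimal standalone sketch (plain Java, not PDI code; the folder path is a placeholder):

Code:

import java.io.InputStream;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class HeaderCheck {
    public static void main(String[] args) throws Exception {
        try (DirectoryStream<Path> files = Files.newDirectoryStream(Paths.get("C:/data/xlsm"), "*.xlsm")) {
            for (Path p : files) {
                byte[] head = new byte[8];
                try (InputStream in = Files.newInputStream(p)) {
                    in.read(head); // the first 8 bytes are enough to identify the container
                }
                StringBuilder hex = new StringBuilder();
                for (byte b : head) {
                    hex.append(String.format("%02X ", b));
                }
                System.out.println(p.getFileName() + " -> " + hex);
            }
        }
    }
}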

I've read a Stack Overflow post (below) which suggests that adding some specific JAR files to my Java path might be a way to resolve this.

http://stackoverflow.com/questions/1...mework-in-java

However, on that note, I worry that my Java path is not set up right. I get this message when loading PDI:

pentaho_java.jpg

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

I think that this is my java path info:

Microsoft Windows [Version 6.3.9600]
(c) 2013 Microsoft Corporation. All rights reserved.


C:\WINDOWS\system32>for %i in (java.exe) do @echo. %~$PATH:i
C:\ProgramData\Oracle\Java\javapath\java.exe


C:\WINDOWS\system32>


- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
This is what I currently have set for java path in my spoon.bat file:



REM Special console/debug options when called from SpoonConsole.bat or SpoonDebug.bat
if "%SPOON_CONSOLE%"=="1" set PENTAHO_JAVA=C:\Program Files\Java\jre1.8.0_112\bin\java.exe
if not "%SPOON_CONSOLE%"=="1" set PENTAHO_JAVA=C:\Program Files\Java\jre1.8.0_112\bin\javaw.exe
set IS64BITJAVA=0


call "%~dp0set-pentaho-env.bat"

Unable to connect to Oracle Database from Pentaho

I am unable to connect to my Oracle database from Pentaho. Please see the screenshot of the error.
PentahoOracleError.jpg

For the database name, this is the full syntax I used:

(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(PROTOCOL=TCP)(HOST=localhost)(PORT=1521)))(CONNECT_DATA=(SERVICE_NAME=PDBORCL)))
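
As a hedged side note, it can help to first rule out a driver or listener problem outside of Spoon. A minimal JDBC check using the thin driver and an equivalent URL for the same host, port and service name (the credentials are placeholders; ojdbc*.jar must be on the classpath):

Code:

import java.sql.Connection;
import java.sql.DriverManager;

public class OracleCheck {
    public static void main(String[] args) throws Exception {
        // Thin-driver equivalent of the descriptor above
        String url = "jdbc:oracle:thin:@//localhost:1521/PDBORCL";
        try (Connection con = DriverManager.getConnection(url, "user", "password")) {
            System.out.println("Connected: " + con.getMetaData().getDatabaseProductVersion());
        }
    }
}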

Asking for assistance please.

How to Integrate or Embed Pentaho Business Intelligence into your application

The Pentaho BA Server is a highly extensible, embeddable, and scriptable business intelligence web application. Internally, it is composed of content-generating engines that provide reporting, online analytical processing (OLAP), and data integration (ETL) functionality. These engines are managed by the Pentaho BI Platform process-flow engine, which is designed for, but not limited to, running such business intelligence tasks as retrieving data from multiple disparate data sources, creating data-driven reports and other content, and scheduling and conditionally automating content delivery.


End users interact with these services through an ad-hoc reporting interface, a choice of OLAP visualization tools (Pentaho Analyzer or JPivot), a dashboard designer, and a convenient scheduling interface.


The BI Platform is a process-flow engine that forms the operational core of the BA Server. It ties the other content engines — Reporting, Analysis, and Data Integration — to the Pentaho User Console to provide content display, delivery, and scheduling functionality. Additionally, the BI Platform offers a powerful scripting framework for conditionally automating tasks. Typically, the BI Platform is used to automate business intelligence tasks that rely on other engines, but it could theoretically be used for practically any logical task.


Embedding refers to adding individual pieces of Pentaho BA Server into your application on a code level without having to run a separate BA Server instance.


Integrating refers to running a fully operational Pentaho BA Server instance in order to access its content and use its functionality. This scenario accomplishes most or all of the goals of embedding the BA Server.


Embedding is what most Pentaho users think they should do. However, the interconnected nature of the BA Server prevents this from being a quick and easy process. Embedding is a highly complex operation that requires skilled programmers and significant developer support from Pentaho's services and engineering teams. Integrating accomplishes most or all of the goals of embedding without the development overhead. For example, instead of embedding Interactive Reporting into your application, you can simply display it in an iframe. Pages 22 through 28 of Integrating Pentaho Software and Content provide examples of running Pentaho Analyzer, Interactive Reporting, and Ad Hoc Reporting in an iframe, and seem to provide hooks/callbacks so that Pentaho can invoke your logic when certain events occur, such as when the user saves a report. All of the Interactive Reporting functionality that you get through the Pentaho User Console will be available to you when you integrate in this manner.


The BI Platform is a lightweight process-flow engine that defines the order of execution of one or more components of the Pentaho BI Platform. It does not generate content itself, so it has to rely on separate engines for creating reports and other content.


Pentaho relies on the Spring Security pluggable authentication framework. By default, the BA Server uses a JDBC based data access object that is tied to a Hibernate database. Users and roles are configured through the Pentaho Enterprise Console, and content authorization is controlled by the BA Server administrator. However, you can easily configure the server to use existing security tables in a different database, or to authenticate through your existing LDAP server or Central Authentication Service. Pentaho's security is also extensible to the point that you can create your own data access object, or completely remove all authentication functionality.


The BA Server is programmable directly through Java code, or dynamically through XML files known as "action sequences". An action sequence contains a set of actions and parameters (input, output, and external resources) in an XML file with a .xaction extension. For this reason, action sequences are sometimes called xactions.


Action sequences activate BI Platform components and compel them to do work. They can be run anytime while the BA Server is active. You can also arrange for action sequences to be run at certain times, or intervals through the built-in scheduling service.


The simplest methods of integrating content generated from Pentaho BA Server are:


Direct services, provided via servlets that deliver content in the URL output stream.
Web services, provided via servlets that deliver content packaged as a SOAP response.
The Pentaho BA Server is capable of serving content using only a definition file. For example, a report created with Report Designer can be served from the BA Server directly from the .prpt definition file. This applies to Analyzer reports, Dashboard Designer views, and Pentaho Data Integration results as well. The BA Server is also capable of processing several steps — or actions — sequentially and returning the resulting output or content.
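
As an illustration of the first (direct-service) option, here is a minimal sketch that pulls a rendered report straight out of the repository over HTTP; the server URL, credentials, report path and output target are placeholder assumptions, not values from this article:

Code:

import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;
import java.util.Base64;

public class FetchReport {
    public static void main(String[] args) throws Exception {
        // Repository path /public/Sales.prpt encoded as %3Apublic%3ASales.prpt
        URL url = new URL("http://localhost:8080/pentaho/api/repos/"
                + "%3Apublic%3ASales.prpt/generatedContent?output-target=pageable/pdf");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        String auth = Base64.getEncoder().encodeToString("admin:password".getBytes("UTF-8"));
        conn.setRequestProperty("Authorization", "Basic " + auth);
        try (InputStream in = conn.getInputStream()) {
            Files.copy(in, Paths.get("Sales.pdf"), StandardCopyOption.REPLACE_EXISTING);
        }
    }
}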

Source: wikidot

Groovy script and user id

hello,

I have a few Groovy scripts that I use inside the BI Server. That works fine. But now I need to be able to find out which user is logged in and make decisions based on that.

Would somebody know how to get the userid from the session? Maybe a code snippet?
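
A minimal sketch of what I believe should work when the script runs inside the BA Server JVM, so the platform classes are on the classpath (Groovy accepts this Java-style syntax as-is):

Code:

import org.pentaho.platform.api.engine.IPentahoSession;
import org.pentaho.platform.engine.core.system.PentahoSessionHolder;

// Session of the user the current request is running as
IPentahoSession session = PentahoSessionHolder.getSession();
String userId = (session != null) ? session.getName() : null;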

Thanks,

Uwe

Login error on SSL

Hi everyone,

I'm getting an error on login ("Login Error. A login error occured. Please try again") while using HTTPS. In the browser console I can see the error:

"The page at 'https://<domainName>/pentaho/Login' was loaded over HTTPS, but displayed insecure content from 'http://<domainName>/pentaho/index.jsp': this content should also be loaded over HTTPS."The same works fine when using http.

In webcontext.js, the SERVER_PROTOCOL is always "http" and FULLY_QUALIFIED_URL is also on http.

What could be the issue?




How to execute a query from within "Query Scripting" initQuery function

Hi,

I try to use "Query Scripting" in order to add more fields dynamically to my query. But since the new field names should be obtained executing another query, the question is how can I execute this other query already present in the report. I'm using ECMAScript but it's ok if I must switch to Groovy. There are some global vars like resourceManager, contextKey, dataFactory,... but I cannot find any doc about them.

I have already looked through the forum and googled for info about "Query Scripting", but I couldn't find any decent documentation.

I hope somebody can help me by pointing to some useful resource.

Thank you.

How to tell if a "run SSH" command fails?

Hi,

I have been working with Pentaho for a while, and now I have come to a frustrating point.
I want to know whether an SSH command failed or not. First of all I tried to use the stdErr output that comes out of the box with the "Run SSH Commands" option, but in fact it only tells me whether the command I used produced any stdErr at all.
When I use it with, for example, a sqoop command, stdOut also goes to stdErr, which raises a false alarm of a failure.

Then I tried to use the exit code provided by bash scripting, for example:
cd FakeFolder ; echo $?
-bash: cd: FakeFolder: No such file or directory
1
and then retrieve the exit code (in that case 1) and check whether it is different from 0. But sometimes Pentaho will "push" the exit code, or basically the output of any command after the original one, so that it is written before the stdOut of the original command.


Now I am completely stuck: how can I get the true exit code, or a true indication of whether the SSH command succeeded or not?
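
One hedged workaround, since PDI's SSH steps use the JSch library under the hood: run the command from a User Defined Java Class and read the channel's exit status directly instead of parsing stdOut/stdErr. A minimal standalone sketch (host and credentials are placeholders):

Code:

import com.jcraft.jsch.ChannelExec;
import com.jcraft.jsch.JSch;
import com.jcraft.jsch.Session;

public class SshExitCode {
    public static void main(String[] args) throws Exception {
        JSch jsch = new JSch();
        Session session = jsch.getSession("user", "host.example.com", 22);
        session.setPassword("secret");
        session.setConfig("StrictHostKeyChecking", "no");
        session.connect();

        ChannelExec channel = (ChannelExec) session.openChannel("exec");
        channel.setCommand("cd FakeFolder");
        channel.connect();
        while (!channel.isClosed()) {
            Thread.sleep(100); // wait for the remote command to finish
        }
        int exitStatus = channel.getExitStatus(); // the remote command's real exit code
        channel.disconnect();
        session.disconnect();
        System.out.println("exit status = " + exitStatus);
    }
}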

Deploying to multiple environments

Hi all,

we use three different environments: DEV, TEST and PROD. Each one comes with its own database used in PDI (TEST uses DB_TEST, DEV uses DB_DEV and PROD uses DB_PROD).

Currently, when we deploy from DEV to TEST or PROD, we have to manually modify the connection parameters to point to the correct database.

Is there a way to avoid having to change the connection parameters, for example by using environment variables or a properties file?


Thanks,
Richard.

How to execute a Sql Statement Select inside a field and get the select data?

I have a table with a field called 'query' containing 'select cdclient, nmclient, dtRegister from tb_client', and another field called 'parameters' containing 'where cdclient = 12345'.
I created a new field called 'Sql' containing the concatenation of the 'query' + 'parameters' fields (forming a complete Select Sql Statement).
Now I want to execute the Select SQL statement inside the 'Sql' field, get the result of the select (the output fields), and generate an Excel file.
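
In PDI this is what the "Dynamic SQL row" step is designed for, as far as I know. As a plain illustration of the idea, here is a minimal JDBC sketch that executes the concatenated field and walks the result set (the connection URL and credentials are placeholders):

Code:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.Statement;

public class RunFieldSql {
    public static void main(String[] args) throws Exception {
        // The two field values from the example row
        String query = "select cdclient, nmclient, dtRegister from tb_client";
        String parameters = "where cdclient = 12345";
        String sql = query + " " + parameters; // the concatenated 'Sql' field

        try (Connection con = DriverManager.getConnection("jdbc:postgresql://localhost/db", "user", "pwd");
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery(sql)) {
            ResultSetMetaData md = rs.getMetaData();
            while (rs.next()) {
                StringBuilder row = new StringBuilder();
                for (int i = 1; i <= md.getColumnCount(); i++) {
                    if (i > 1) row.append(", ");
                    row.append(rs.getString(i));
                }
                System.out.println(row); // these rows would feed the Excel output step
            }
        }
    }
}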

Exporting and importing transformation between repositories gives error re distribute

I export a transformation from one db repository then import it into another. When I try to save the new transformation I get:
ERROR: column "distribute" is of type boolean but expression is of type character varying

I get the same error if I copy the transformation to the clipboard and paste it in the other repository.
The XML contains many instances of <distribute>Y</distribute>, which would be whether the step should distribute rows to the next step.

I have tried changing the distribute values (in the XML) to 0 and 1, also true and false.

How can I get the transformation copied properly between the two repositories?

(I am using GeoKettle 2.5)

No suitable driver found for jdbc:hive - Spoon Database connections to Hadoop Hive

PDI database connections to Hadoop Hive are failing with the message "No suitable driver found for jdbc:hive".

I have installed Pentaho BA Server 6.1 on Windows environment and Hortonworks Sandbox 2.4 on VMware Workstation.

Pentaho and HDP 2.4 integration is working fine for all components except Hive.

When I try to create database connection to Hadoop Hive in PDI Spoon, I get the following error.

I tried copying hive-jdbc.jar (downloaded from the HDP 2.4 location /usr/hdp/2.4.0.0-169/hive/lib) to the Pentaho locations C:\Pentaho\server\biserver-ee\tomcat\lib and C:\Pentaho\server\data-integration-server\tomcat\lib, but this does not work and I get the same error. Any suggestion on this would be helpful.

org.pentaho.di.core.exception.KettleDatabaseException:
Error occurred while trying to connect to the database


Error connecting to database: (using class org.apache.hadoop.hive.jdbc.HiveDriver)
No suitable driver found for jdbc:hive://sandbox.hortonworks.com:10000/default




at org.pentaho.di.core.database.Database.normalConnect(Database.java:466)
at org.pentaho.di.core.database.Database.connect(Database.java:364)
at org.pentaho.di.core.database.Database.connect(Database.java:335)
at org.pentaho.di.core.database.Database.connect(Database.java:325)
at org.pentaho.di.ui.core.database.dialog.XulDatabaseExplorerController.createDatabaseNodes(XulDatabaseExplorerController.java:380)
at org.pentaho.di.ui.core.database.dialog.XulDatabaseExplorerController.init(XulDatabaseExplorerController.java:130)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.pentaho.ui.xul.impl.AbstractXulDomContainer.invoke(AbstractXulDomContainer.java:313)
at org.pentaho.ui.xul.impl.AbstractXulDomContainer.initialize(AbstractXulDomContainer.java:141)
at org.pentaho.ui.xul.swt.SwtXulRunner$1.run(SwtXulRunner.java:67)
at org.eclipse.swt.widgets.Synchronizer.syncExec(Unknown Source)
at org.eclipse.swt.widgets.Display.syncExec(Unknown Source)
at org.pentaho.ui.xul.swt.SwtXulRunner.initialize(SwtXulRunner.java:64)
at org.pentaho.di.ui.core.database.dialog.XulDatabaseExplorerDialog.open(XulDatabaseExplorerDialog.java:92)
at org.pentaho.di.ui.core.database.dialog.DataOverrideHandler.explore(DataOverrideHandler.java:84)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.pentaho.ui.xul.impl.AbstractXulDomContainer.invoke(AbstractXulDomContainer.java:313)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:157)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:141)
at org.pentaho.ui.xul.swt.tags.SwtButton.access$500(SwtButton.java:43)
at org.pentaho.ui.xul.swt.tags.SwtButton$4.widgetSelected(SwtButton.java:137)
at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.eclipse.jface.window.Window.runEventLoop(Window.java:820)
at org.eclipse.jface.window.Window.open(Window.java:796)
at org.pentaho.di.ui.xul.KettleDialog.show(KettleDialog.java:88)
at org.pentaho.di.ui.xul.KettleDialog.show(KettleDialog.java:55)
at org.pentaho.di.ui.core.database.dialog.XulDatabaseDialog.open(XulDatabaseDialog.java:116)
at org.pentaho.di.ui.core.database.dialog.DatabaseDialog.open(DatabaseDialog.java:60)
at org.pentaho.di.ui.job.entry.JobEntryDialog.showDbDialogUnlessCancelledOrValid(JobEntryDialog.java:270)
at org.pentaho.di.ui.job.entry.JobEntryDialog$EditConnectionListener.widgetSelected(JobEntryDialog.java:373)
at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.pentaho.di.ui.job.entries.sql.JobEntrySQLDialog.open(JobEntrySQLDialog.java:430)
at org.pentaho.di.ui.spoon.delegates.SpoonJobDelegate.editJobEntry(SpoonJobDelegate.java:259)
at org.pentaho.di.ui.spoon.Spoon.editJobEntry(Spoon.java:8619)
at org.pentaho.di.ui.spoon.job.JobGraph.editEntry(JobGraph.java:2892)
at org.pentaho.di.ui.spoon.job.JobGraph.mouseDoubleClick(JobGraph.java:636)
at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.pentaho.di.ui.spoon.Spoon.readAndDispatch(Spoon.java:1347)
at org.pentaho.di.ui.spoon.Spoon.waitForDispose(Spoon.java:7989)
at org.pentaho.di.ui.spoon.Spoon.start(Spoon.java:9269)
at org.pentaho.di.ui.spoon.Spoon.main(Spoon.java:662)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.pentaho.commons.launcher.Launcher.main(Launcher.java:92)
Caused by: org.pentaho.di.core.exception.KettleDatabaseException:
Error connecting to database: (using class org.apache.hadoop.hive.jdbc.HiveDriver)
No suitable driver found for jdbc:hive://sandbox.hortonworks.com:10000/default


at org.pentaho.di.core.database.Database.connectUsingClass(Database.java:579)
at org.pentaho.di.core.database.Database.normalConnect(Database.java:450)
... 63 more
Caused by: java.sql.SQLException: No suitable driver found for jdbc:hive://sandbox.hortonworks.com:10000/default
at java.sql.DriverManager.getConnection(Unknown Source)
at java.sql.DriverManager.getConnection(Unknown Source)
at org.pentaho.di.core.database.Database.connectUsingClass(Database.java:565)
... 64 more
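
A couple of hedged observations: the hive-jdbc.jar that ships with HDP 2.4 provides the HiveServer2 driver (org.apache.hive.jdbc.HiveDriver, URL prefix jdbc:hive2://), while the error above shows Spoon trying the legacy HiveServer1 class with a jdbc:hive:// URL; also, as far as I know PDI loads Hive drivers through its Big Data shim (plugins/pentaho-big-data-plugin/hadoop-configurations) rather than from tomcat\lib. A minimal classpath check, run with the same jars Spoon sees:

Code:

public class HiveDriverCheck {
    public static void main(String[] args) {
        // Legacy HiveServer1 driver (jdbc:hive://) vs HiveServer2 driver (jdbc:hive2://)
        String[] candidates = {
            "org.apache.hadoop.hive.jdbc.HiveDriver",
            "org.apache.hive.jdbc.HiveDriver"
        };
        for (String cls : candidates) {
            try {
                Class.forName(cls);
                System.out.println("found:   " + cls);
            } catch (ClassNotFoundException e) {
                System.out.println("missing: " + cls);
            }
        }
    }
}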

Getting full log content from within a job

Hi,

I want to parse and process all log entries from inside a job, including all its sub-jobs, sub-transformations, etc. In other words, more or less the full log I'd get as output when using the kitchen command-line script.

What I've tried so far is using the UDJC step to get hold of the root job's channel ID and then getting all its children, like this:

Code:

import java.util.List;
import org.pentaho.di.core.logging.KettleLogStore;
import org.pentaho.di.core.logging.LoggingObjectInterface;
import org.pentaho.di.core.logging.LoggingRegistry;

// Walk up the parent chain to find the root job's logging object
LoggingObjectInterface logObjInterface = getTrans().getParent();
while (logObjInterface.getParent() != null) {
    logObjInterface = logObjInterface.getParent();
}

// Collect the root channel's child channel IDs and fetch their buffered log lines
LoggingRegistry loggingRegistry = LoggingRegistry.getInstance();
List childChannelIds = loggingRegistry.getLogChannelChildren(logObjInterface.getLogChannelId());
List logEvents = KettleLogStore.getLogBufferFromTo(childChannelIds, true, 0, 999999);

But this seems to get only some of the events (mostly from jobs, but no transformations at all), even though the API documentation for KettleLogStore.getLogBufferFromTo says: Get all the log lines for the specified parent log channel id (including all children)

How can I get ALL the log entries of the job I run, ideally as a String, please?

Thanks.

Pentaho reporting performance issue

I have a report with a query, including joins, that returns 2,480 rows. Initially I ran the report in the 5.0.1 stable version, but during preview the report gets stuck at the 202nd row. So I installed the latest version, i.e. 7.0.0, but I am still facing the same issue: it is stuck at the 202nd row. The query is already tuned, so no changes are required there. Please suggest a solution asap.

Update Step Issue

Hello Team

I've got a weird problem. I am using PDI 4.3 with a Postgres DB and I'm using the Update step to update a record.

Problem:-
I have a column containing "2017-01-31 17:00:14-08" with data type "timestamp with time zone", and I am updating the table by comparing it with "2017-01-31 17:00:14" using the "=" comparator.

When I ran the process I got the exception below.

ERROR 31-01 23:33:44,210 - Update - Error in step, asking everyone to stop because of:
ERROR 31-01 23:33:44,210 - Update - org.pentaho.di.core.exception.KettleDatabaseException:
Entry to update with following key could not be found: [2017-01-31 17:00:14]


at org.pentaho.di.trans.steps.update.Update.lookupValues(Update.java:133)
at org.pentaho.di.trans.steps.update.Update.processRow(Update.java:327)
at org.pentaho.di.trans.step.RunThread.run(RunThread.java:50)
at java.lang.Thread.run(Unknown Source)


INFO 31-01 23:33:44,211 - Update - Finished processing (I=1, O=0, R=1, W=0, U=0, E=1)

We have 10 servers and only 1 out of the 10 has this issue. The table structure and code are the same on all of them. Has anyone seen this issue with the Update step?
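
One thing worth ruling out (purely an assumption on my side): the column is timestamp with time zone, but the comparison key "2017-01-31 17:00:14" carries no zone, so the instant it resolves to depends on the server's or session's time zone, and a single server with a different zone setting would miss the key. A small standalone illustration of the effect (Java 8 java.time, not PDI code):

Code:

import java.time.Instant;
import java.time.LocalDateTime;
import java.time.ZoneId;

public class TzCompare {
    public static void main(String[] args) {
        LocalDateTime key = LocalDateTime.parse("2017-01-31T17:00:14");
        // The same wall-clock key maps to different instants under different zones
        for (String zone : new String[] {"America/Los_Angeles", "UTC", "Europe/Berlin"}) {
            Instant instant = key.atZone(ZoneId.of(zone)).toInstant();
            System.out.println(zone + " -> " + instant);
        }
    }
}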

Sql-File as Table Input

Hello,

I have a bunch of SQL files with Select statements and would like to use them as input for the Table Input step.

Right now I read the SQL file with "Get File Names", which sends it to the result set.

In the next transformation I use "Get Files from Result" and set a hop to the "Set Variables" step to create a variable with the path of the file,
but if I put that variable into the Table Input step it does not work.

Is there someone who can help me fix my problem?
I feel like it's not that hard, but I can't get my head around it.
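
A minimal standalone sketch of the underlying idea (an assumption on my part that the variable currently holds the file path): what ultimately has to reach the database is the SQL text read out of the file, not the path itself. The file path and connection below are placeholders:

Code:

import java.nio.file.Files;
import java.nio.file.Paths;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class RunSqlFile {
    public static void main(String[] args) throws Exception {
        // Read the SELECT statement out of the .sql file; this string, not the path,
        // is what a Table Input-style query execution needs.
        String sql = new String(Files.readAllBytes(Paths.get("queries/report.sql")), "UTF-8");
        try (Connection con = DriverManager.getConnection("jdbc:postgresql://localhost/db", "user", "pwd");
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery(sql)) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}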


kind regards
hulk